Neural networks are among the most powerful machine learning models and, thanks to the increasing computing power of modern processors, have found numerous applications across scientific and industrial fields in recent years. One notable property of these models is that, using only two hidden layers of neurons, they can approximate any continuous function to the desired accuracy (the universal approximation theorem), so other functions can be implemented with their help. Following the continued advances of this architecture and its growing adoption in industry, very capable frameworks have been developed that make it easy to build complex, highly efficient neural networks. In this project we use the TensorFlow library and its high-level Keras interface to tackle an image-classification problem: given a picture of an animal, the network should identify its species. The data comes from a wildlife image dataset; from it we will train a neural network on 4 classes: Elk, Raccoon, Raven, and Bald Eagle.
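To make the "two hidden layers" idea concrete, here is a minimal Keras sketch of such a classifier for the four classes above. The layer widths, input size, and optimizer here are illustrative assumptions, not the final architecture used later in the notebook:

```python
import tensorflow as tf
from tensorflow import keras

NUM_CLASSES = 4  # Elk, Raccoon, Raven, Bald Eagle


def build_model(input_shape=(224, 224, 3)):
    """A fully connected network with two hidden layers on flattened pixels."""
    model = keras.Sequential([
        keras.layers.Rescaling(1.0 / 255, input_shape=input_shape),  # normalize to [0, 1]
        keras.layers.Flatten(),
        keras.layers.Dense(128, activation="relu"),   # hidden layer 1
        keras.layers.Dense(64, activation="relu"),    # hidden layer 2
        keras.layers.Dense(NUM_CLASSES, activation="softmax"),  # class probabilities
    ])
    model.compile(
        optimizer="adam",
        loss="sparse_categorical_crossentropy",  # integer class labels
        metrics=["accuracy"],
    )
    return model
```

In practice a convolutional architecture would perform far better on images; this sketch only illustrates that a softmax output over the 4 classes sits on top of a small stack of hidden layers.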
import numpy as np
import pandas as pd
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers, models
import matplotlib.pyplot as plt
from sklearn.metrics import classification_report
from google.colab import drive
drive.mount('/content/drive')
Mounted at /content/drive
!cp "/content/drive/MyDrive/Term 5/AI/CA5-P2/dataset.zip" .
!unzip "/content/drive/MyDrive/Term 5/AI/CA5-P2/dataset.zip"
Archive:  /content/drive/MyDrive/Term 5/AI/CA5-P2/dataset.zip
  creating: dataset/test/
  creating: dataset/test/bald_eagle/
  inflating: dataset/test/bald_eagle/... (per-file listing trimmed)
  creating: dataset/test/elk/
  inflating: dataset/test/elk/... (per-file listing trimmed)
  creating: dataset/test/racoon/
  inflating: dataset/test/racoon/... (per-file listing trimmed)
dataset/test/racoon/def2f7585d6d6368c0.JPG inflating: dataset/test/racoon/e0ec0b676a4eb30f3c.jpg inflating: dataset/test/racoon/e1a1207a25108dc458.jpg inflating: dataset/test/racoon/e2e85455b79654da1c.jpg inflating: dataset/test/racoon/e5d0e35168449e496d.jpeg inflating: dataset/test/racoon/e958c14a1b18be38aa.jpg inflating: dataset/test/racoon/e9bc5d28f8ad29df8a.jpg inflating: dataset/test/racoon/e9e9663163016ad7c7.jpg inflating: dataset/test/racoon/eac67c87246588c168.jpg inflating: dataset/test/racoon/eaea5f1eb63e1a6328.gif inflating: dataset/test/racoon/eaf7b2965ed59ebfe9.jpg inflating: dataset/test/racoon/ec01606a31a9369e49.jpg inflating: dataset/test/racoon/ec6ddf8a1828c2ef26.jpg inflating: dataset/test/racoon/f0b470f6cd8e237c8b.jpg inflating: dataset/test/racoon/f234b176a80fdeac0d.jpg inflating: dataset/test/racoon/f2ecab227741dc37bb.jpg inflating: dataset/test/racoon/f3023f6e586ffbfcfa.jpg inflating: dataset/test/racoon/f36d5b7ca1c2585113.JPG inflating: dataset/test/racoon/f7a5e1a014192c4c09.jpg inflating: dataset/test/racoon/f943fba1f46800e2d2.jpg inflating: dataset/test/racoon/fab800ef6afc8baf97.jpg inflating: dataset/test/racoon/fbadbabd208d79a813.jpg inflating: dataset/test/racoon/fca61868140fbbabdd.jpg inflating: dataset/test/racoon/fe4b1150d2afa05db1.jpg inflating: dataset/test/racoon/fe647759b440b0694a.png inflating: dataset/test/racoon/fe855620e6ffacaced.jpg creating: dataset/test/raven/ inflating: dataset/test/raven/014a24f535075af808.jpg inflating: dataset/test/raven/02596ecd8e21227db6.jpg inflating: dataset/test/raven/027677c87ac858e986.jpg inflating: dataset/test/raven/02dc0dd2888d16d8d8.jpg inflating: dataset/test/raven/02f591d9f9d9db4905.jpg inflating: dataset/test/raven/0574814c3ab3f32c29.jpg inflating: dataset/test/raven/060cce110b87af5691.jpg inflating: dataset/test/raven/062c78bfbcf82dbe15.jpeg inflating: dataset/test/raven/07a8180c1901342de6.jpg inflating: dataset/test/raven/07ec71d562b055542d.jpg inflating: 
dataset/test/raven/08886860e4cb9df50f.jpg inflating: dataset/test/raven/0ac4bb35a3a0c0f15c.jpg inflating: dataset/test/raven/0b4fcf8721592cd307.jpg inflating: dataset/test/raven/0b658da46220c7d8da.jpg inflating: dataset/test/raven/0e677f96cbdd450c58.png inflating: dataset/test/raven/0f971c33b5cd45e515.jpg inflating: dataset/test/raven/112f2264213995ff61.jpg inflating: dataset/test/raven/129aa856ae56f5a563.jpg inflating: dataset/test/raven/132bfea8002e8ab384.jpg inflating: dataset/test/raven/1365b394e06e974c17.jpg inflating: dataset/test/raven/1455fdfd9ccc1508d8.jpg inflating: dataset/test/raven/1457f8f044b28f0c4d.jpg inflating: dataset/test/raven/14bc291e14cbb034a0.jpg inflating: dataset/test/raven/156d96ab2b51274146.jpg inflating: dataset/test/raven/15e7fc216c4a35f033.jpg inflating: dataset/test/raven/17def50bb96a9777f2.jpg inflating: dataset/test/raven/1851a987d50a342781.jpg inflating: dataset/test/raven/1884b221cffdb28163.jpg inflating: dataset/test/raven/18a18bf541437088c1.jpg inflating: dataset/test/raven/19844351c0456df3b3.jpg inflating: dataset/test/raven/19f6a4968c75c6556d.jpg inflating: dataset/test/raven/19fcd5af5666438faf.jpg inflating: dataset/test/raven/1a385a6113ab933163.jpg inflating: dataset/test/raven/1a8bfa8c08a5350625.jpg inflating: dataset/test/raven/1acfa15a98a7a41991.jpg inflating: dataset/test/raven/1b0969711cf3dd4d53.jpg inflating: dataset/test/raven/1df175a3378e18a843.jpg inflating: dataset/test/raven/1e5dba34c9a351201a.jpg inflating: dataset/test/raven/1e6dcd844e9e7290af.jpg inflating: dataset/test/raven/23056ec904da40c140.jpg inflating: dataset/test/raven/236f0455ae10f6bf24.jpg inflating: dataset/test/raven/24c79355f932a0c38e.jpg inflating: dataset/test/raven/24f54c56637b5cf691.jpg inflating: dataset/test/raven/257c2bf88e61bb6bc7.jpg inflating: dataset/test/raven/25e450b14f4f0feccc.jpg inflating: dataset/test/raven/26504fc2dff58bf4f7.jpg inflating: dataset/test/raven/26b6bfc4dff86ac19d.jpg inflating: 
dataset/test/raven/2ab0129d87a6056425.jpg inflating: dataset/test/raven/2b4c71acdf040bedec.jpg inflating: dataset/test/raven/2b7eb5e14210f9b2c3.jpg inflating: dataset/test/raven/2cd2cdf42d90cded0b.jpg inflating: dataset/test/raven/2d6de8a73630ff3abb.jpg inflating: dataset/test/raven/2eb6c150deb919d9a1.jpg inflating: dataset/test/raven/301d4ac99ee1b73163.gif inflating: dataset/test/raven/31edb5cf93e4e954fe.jpg inflating: dataset/test/raven/338b48973f426146e6.jpg inflating: dataset/test/raven/342641214c7419ba31.jpg inflating: dataset/test/raven/34e7674fefb00848fb.png inflating: dataset/test/raven/3504b6136e2b996e51.jpeg inflating: dataset/test/raven/372b218d190b448aff.jpg inflating: dataset/test/raven/37b260a15f9ba5ac2c.jpg inflating: dataset/test/raven/38335af512681716d5.JPG inflating: dataset/test/raven/391483757271e13bf6.jpg inflating: dataset/test/raven/39273644adaf3b97af.jpg inflating: dataset/test/raven/3e1ee0164586c87674.jpg inflating: dataset/test/raven/3f5393654429eac68e.gif inflating: dataset/test/raven/418d30baae244a1f8c.jpg inflating: dataset/test/raven/4339bb1bcd16dffef8.jpg inflating: dataset/test/raven/43f71515af6e9514e9.jpg inflating: dataset/test/raven/449ce336e8d5b39a85.jpg inflating: dataset/test/raven/44c83e96fb0c95c8bc.jpg inflating: dataset/test/raven/461913ecd1d0022b57.jpg inflating: dataset/test/raven/470eed15d963d5f444.jpg inflating: dataset/test/raven/4884373b246ca0108d.jpg inflating: dataset/test/raven/4a0d2453d694853980.jpg inflating: dataset/test/raven/4f19533a7f219e46cc.jpg inflating: dataset/test/raven/509017a3ecdcde556c.jpg inflating: dataset/test/raven/5125cf2eab518cda0a.jpg inflating: dataset/test/raven/57e28de994829115f4.jpg inflating: dataset/test/raven/58ce900c539b59387a.jpg inflating: dataset/test/raven/5a79129a069fc8ce68.jpg inflating: dataset/test/raven/5b4828e107d6472415.jpg inflating: dataset/test/raven/5c11a42e4391c294d2.jpg inflating: dataset/test/raven/5c96693a30a421eadf.jpg inflating: 
dataset/test/raven/5d2891320fda83ff58.jpg inflating: dataset/test/raven/60934f92db38975126.jpg inflating: dataset/test/raven/610281b1f72504260c.jpg inflating: dataset/test/raven/613f11539b88721f0f.jpg inflating: dataset/test/raven/61dfbe25a25dc34cf5.jpg inflating: dataset/test/raven/640eceba1c69b8097b.jpg inflating: dataset/test/raven/6499e4f4c290578bfe.jpg inflating: dataset/test/raven/65f81c87b65ce7466d.jpg inflating: dataset/test/raven/66865143920747f741.png inflating: dataset/test/raven/67b73804e5e85125b2.png inflating: dataset/test/raven/68cbe2f21b37251eec.jpg inflating: dataset/test/raven/68dce134c758b3a779.jpg inflating: dataset/test/raven/6a7272860c597c2580.jpg inflating: dataset/test/raven/6cbd0df735649bde06.jpg inflating: dataset/test/raven/6eb469f9ff47bd8bff.jpg inflating: dataset/test/raven/6ec4e4e178a26afa0a.jpg inflating: dataset/test/raven/6f0a3f98540df63f92.jpg inflating: dataset/test/raven/6f4b4fbd6baccd3128.jpg inflating: dataset/test/raven/712cc96ee15ddb56c1.jpg inflating: dataset/test/raven/71bddc5219568e6404.jpg inflating: dataset/test/raven/71e2ecbc6d5c9ada69.jpg inflating: dataset/test/raven/728d77897df9a82372.jpg inflating: dataset/test/raven/73884b98b34ed9ab90.jpeg inflating: dataset/test/raven/76e202b8aa7b468755.png inflating: dataset/test/raven/79f87db715ae7bd985.jpg inflating: dataset/test/raven/7adb2698933147ff41.jpg inflating: dataset/test/raven/7d74166aaee5e5733a.jpg inflating: dataset/test/raven/7fb2de187b66f22217.jpg inflating: dataset/test/raven/7fb4edc3ab2528e91a.jpg inflating: dataset/test/raven/80bf9dce48f00fa14a.jpg inflating: dataset/test/raven/820d4c0228598e55b2.jpg inflating: dataset/test/raven/86e1f149a3f4fd1940.jpg inflating: dataset/test/raven/872a770a6a591795fb.jpg inflating: dataset/test/raven/88b9704957ab4e55ed.jpg inflating: dataset/test/raven/8c84074e1272e9770d.jpg inflating: dataset/test/raven/8da5382c99c5f71122.jpg inflating: dataset/test/raven/8dd82ad76f9116afa0.jpg inflating: 
dataset/test/raven/9083968d273c781bb1.jpg inflating: dataset/test/raven/93092835252ce71f0f.png inflating: dataset/test/raven/9408265cb4bee611e2.jpg inflating: dataset/test/raven/9a41f31192873b0d2a.jpg inflating: dataset/test/raven/9f3718e4f2e8f81e5a.jpg inflating: dataset/test/raven/9fe2a7d65252756452.jpg inflating: dataset/test/raven/9ff43bc1c4e98d92e2.jpg inflating: dataset/test/raven/9ff78aeea2d1f391ae.jpg inflating: dataset/test/raven/a0321a4b469f7bf185.jpg inflating: dataset/test/raven/a54da9e5ab0e2eb537.jpg inflating: dataset/test/raven/a6b5110dec597d27b9.jpg inflating: dataset/test/raven/a81387b3b81b10ac18.jpeg inflating: dataset/test/raven/a8bca33c573bf32455.jpg inflating: dataset/test/raven/a95c17af4bcb2f9313.jpg inflating: dataset/test/raven/a9a11e8c5494db3efc.png inflating: dataset/test/raven/a9acb08fce8f43b45d.jpg inflating: dataset/test/raven/ab94ac06faa497ca5d.jpg inflating: dataset/test/raven/ad514f24087465cfe7.jpg inflating: dataset/test/raven/afd29863ed3f9044e5.jpg inflating: dataset/test/raven/b2ab17baadbf881496.jpg inflating: dataset/test/raven/b5e9eb21664969113e.jpg inflating: dataset/test/raven/b61d6b641355fcb767.jpg inflating: dataset/test/raven/b7c39c9d83ae4ae3e8.jpg inflating: dataset/test/raven/bd65b337223a2c503a.jpg inflating: dataset/test/raven/bff0e236e387c8cd6f.jpg inflating: dataset/test/raven/c32acf94fba6a00bd6.jpg inflating: dataset/test/raven/c365ef1c862db0cd4c.jpg inflating: dataset/test/raven/c4e5c09a748d6209ef.jpg inflating: dataset/test/raven/c91d37ab3f0106524b.jpg inflating: dataset/test/raven/ca5cfc9fef51b8914a.jpg inflating: dataset/test/raven/cb941dc5b08fbe446f.jpg inflating: dataset/test/raven/cc99bc0073575cb371.jpg inflating: dataset/test/raven/ccf4952fadf07e9bb3.jpg inflating: dataset/test/raven/ce7287f823b6c469f2.jpg inflating: dataset/test/raven/cfeb9b6b9d702efa4b.jpeg inflating: dataset/test/raven/d03791850ed56aa28b.jpg inflating: dataset/test/raven/d2579c16fc3af449ad.jpg inflating: 
dataset/test/raven/d7fbcb31197450f984.jpg inflating: dataset/test/raven/d9cd2a8e8b1220ca5b.jpg inflating: dataset/test/raven/da9959442cbe92842b.jpg inflating: dataset/test/raven/dd8c3c76798129e13f.jpg inflating: dataset/test/raven/ddb9322c0a6398b9df.jpg inflating: dataset/test/raven/de48a750135b9c415c.jpg inflating: dataset/test/raven/ded2575a00d8420531.jpg inflating: dataset/test/raven/e089c507c185fe99e5.jpg inflating: dataset/test/raven/e45ccedbc19689bb1d.jpg inflating: dataset/test/raven/e4a4d36881e703ff3e.jpg inflating: dataset/test/raven/e51e7169af12f4efba.jpg inflating: dataset/test/raven/e74c75e4d416931ca8.jpg inflating: dataset/test/raven/e84916b8dc767a3fc5.jpg inflating: dataset/test/raven/e9f32c2171ab34b718.jpg inflating: dataset/test/raven/ea993c0847231d6b3a.jpg inflating: dataset/test/raven/ec5250b624d8570ff6.JPG inflating: dataset/test/raven/ec7b2fa2e41d3394a3.jpg inflating: dataset/test/raven/ece55a3f8c3ecea5ba.jpg inflating: dataset/test/raven/ed71d69f1e61205d36.jpg inflating: dataset/test/raven/ede3385c0b87ccc091.jpg inflating: dataset/test/raven/ee56fa25b14c297704.jpg inflating: dataset/test/raven/eef39b4c57b40b146d.jpg inflating: dataset/test/raven/f15e0efdadd9837089.jpg inflating: dataset/test/raven/f33e151ae9b85148b4.jpg inflating: dataset/test/raven/f4910084f4136f99c4.jpg inflating: dataset/test/raven/f60b766aea90c71611.jpg inflating: dataset/test/raven/f78c48aa9dc0f7288f.jpg inflating: dataset/test/raven/f8ed976eb8b8a29d71.jpg inflating: dataset/test/raven/fa746b0c37b5cb3ec1.jpg inflating: dataset/test/raven/fb15919b60780d5be9.jpg inflating: dataset/test/raven/fbe6d6ff1a15d7aab9.jpg inflating: dataset/test/raven/fc559fd964d6e5ca7a.png inflating: dataset/test/raven/fc659796db407afb8c.jpg inflating: dataset/test/raven/fca560f4b03a2fe89a.jpg inflating: dataset/test/raven/fd2c583b584dd5e2ca.jpg inflating: dataset/test/raven/fdac78f74e76d13bb0.jpg inflating: dataset/test/raven/fdeb8e1809a7556332.jpg inflating: 
dataset/test/raven/fe5677587873cc8044.jpg inflating: dataset/test/raven/ff8f63bf972bf1f13f.jpg creating: dataset/train/ creating: dataset/train/bald_eagle/ inflating: dataset/train/bald_eagle/00e148aeea989ba56b.JPG inflating: dataset/train/bald_eagle/019edce49ef404db98.jpg inflating: dataset/train/bald_eagle/02d1e1ddfce422b881.jpg inflating: dataset/train/bald_eagle/037764efc31f00fb1d.jpg inflating: dataset/train/bald_eagle/04474682b5a82f6bff.jpg inflating: dataset/train/bald_eagle/05744a7110af55c085.jpg inflating: dataset/train/bald_eagle/0599376f745bfe03bf.jpg inflating: dataset/train/bald_eagle/05c9c59094cf0f38df.jpg inflating: dataset/train/bald_eagle/05ed9e5a7ece9bb761.jpg inflating: dataset/train/bald_eagle/06059724e4037056e8.jpg inflating: dataset/train/bald_eagle/060ae23654b5eb141c.jpg inflating: dataset/train/bald_eagle/060da1a96cfdd6929a.jpg inflating: dataset/train/bald_eagle/065c6ce213d36625d6.jpg inflating: dataset/train/bald_eagle/0675606cc29ef2403e.jpg inflating: dataset/train/bald_eagle/08882aac9e5b922c5b.jpg inflating: dataset/train/bald_eagle/08ae1bf2461ab8909e.jpg inflating: dataset/train/bald_eagle/09465c318d37b054ab.png inflating: dataset/train/bald_eagle/0a37e2dba1c89ea4b7.jpg inflating: dataset/train/bald_eagle/0b2f640b94a625e21e.jpeg inflating: dataset/train/bald_eagle/0c26b6ddb61fd8e8d9.jpg inflating: dataset/train/bald_eagle/0c79417e44b9523c60.jpg inflating: dataset/train/bald_eagle/0d61d5f95e66e535d1.jpg inflating: dataset/train/bald_eagle/0da854bf125fd9e31b.jpg inflating: dataset/train/bald_eagle/0db71df03d631dfe94.jpg inflating: dataset/train/bald_eagle/0e56557c1c81927ebb.jpg inflating: dataset/train/bald_eagle/0e745380791e9686c0.jpg inflating: dataset/train/bald_eagle/0ef148ffde892cb4fe.jpg inflating: dataset/train/bald_eagle/0f2b0f2107e928768b.jpg inflating: dataset/train/bald_eagle/0fb90613b639f93aa2.jpg inflating: dataset/train/bald_eagle/1055be731faa011a1c.jpg inflating: dataset/train/bald_eagle/1121285d46990d6f54.jpg inflating: 
dataset/train/bald_eagle/113b552a679eea4a27.jpg inflating: dataset/train/bald_eagle/1289fcb0a241f85b88.jpg inflating: dataset/train/bald_eagle/12abacf3fe925b1d4e.jpg inflating: dataset/train/bald_eagle/136eb35ae8371c9d95.jpg inflating: dataset/train/bald_eagle/1476060d3d7b0def34.jpg inflating: dataset/train/bald_eagle/150ebd3ad774fb0f94.jpg inflating: dataset/train/bald_eagle/16981d1ed3fbbe1fd6.jpg inflating: dataset/train/bald_eagle/16acb73721d825170a.jpg inflating: dataset/train/bald_eagle/16ecacbc30288e9e20.jpg inflating: dataset/train/bald_eagle/180a6e71c3688b4ea5.jpg inflating: dataset/train/bald_eagle/18f946a37be808cc58.jpg inflating: dataset/train/bald_eagle/1a56022b2e142b9b63.jpg inflating: dataset/train/bald_eagle/1a73c83d2aefcaee85.jpg inflating: dataset/train/bald_eagle/1b70f029c9ca5ca64d.jpg inflating: dataset/train/bald_eagle/1c5de9c7fef7c6599a.JPG inflating: dataset/train/bald_eagle/1c63289034dc1c61e2.jpg inflating: dataset/train/bald_eagle/1e3d2eac2b01262c60.jpg inflating: dataset/train/bald_eagle/1fcd31eae27cae3a52.jpg inflating: dataset/train/bald_eagle/20b33e668d35b695fc.jpg inflating: dataset/train/bald_eagle/21050693f78421752f.jpg inflating: dataset/train/bald_eagle/2125a12b451a026c77.jpg inflating: dataset/train/bald_eagle/21c7c6ba9b208f1dcb.jpg inflating: dataset/train/bald_eagle/2215ecd374c30dbfdf.jpg inflating: dataset/train/bald_eagle/22315f3af8071c0d9d.jpg inflating: dataset/train/bald_eagle/227e572b2a5efe4e4c.JPG inflating: dataset/train/bald_eagle/22ff85652f00271704.jpg inflating: dataset/train/bald_eagle/2318897cefcbd5b01e.jpg inflating: dataset/train/bald_eagle/23f29a55531e1084f3.jpg inflating: dataset/train/bald_eagle/243823126b77210561.jpg inflating: dataset/train/bald_eagle/2479e336a52ac87ae1.jpg inflating: dataset/train/bald_eagle/25736213b5e7925641.jpg inflating: dataset/train/bald_eagle/259afd0f829cbcf686.jpg inflating: dataset/train/bald_eagle/25dd6c38cba99c64d9.jpg inflating: dataset/train/bald_eagle/25f72baf93e07ed3af.jpg 
inflating: dataset/train/bald_eagle/2645f9977cba13323a.jpg inflating: dataset/train/bald_eagle/26ae1a6c4c68d54286.jpg inflating: dataset/train/bald_eagle/278cc35aece9b5f447.jpg inflating: dataset/train/bald_eagle/27c52b65092533f0a5.JPG inflating: dataset/train/bald_eagle/27e6f3573a01a92fba.jpg inflating: dataset/train/bald_eagle/281ac47a3399e1a546.jpg inflating: dataset/train/bald_eagle/28215f74259acbc105.jpg inflating: dataset/train/bald_eagle/28ec0bfd5ad1251fda.jpg inflating: dataset/train/bald_eagle/292a7cc6adf9b60e4f.jpg inflating: dataset/train/bald_eagle/29d220dc169eacc1b1.jpg inflating: dataset/train/bald_eagle/2b6463417f833b1010.jpg inflating: dataset/train/bald_eagle/2b71d7aecd511c02e3.jpg inflating: dataset/train/bald_eagle/2bff93fc55aefcbd9e.jpg inflating: dataset/train/bald_eagle/2c5d970a07f885d174.jpg inflating: dataset/train/bald_eagle/2c7d3205ddde47a2f9.jpg inflating: dataset/train/bald_eagle/2d0bfed570a6c024e0.jpg inflating: dataset/train/bald_eagle/2dabc434a1ee44d8b4.jpg inflating: dataset/train/bald_eagle/2db26287be968941c1.jpg inflating: dataset/train/bald_eagle/2e20fffce8c396dea3.jpg inflating: dataset/train/bald_eagle/2e3213af707e634650.jpg inflating: dataset/train/bald_eagle/2f607e81311b620e49.jpg inflating: dataset/train/bald_eagle/30f1a280bf27d2ca1c.jpg inflating: dataset/train/bald_eagle/30f7dc316648105642.jpg inflating: dataset/train/bald_eagle/323e5b3ecaf322104a.jpg inflating: dataset/train/bald_eagle/32892c845f516e3c9e.jpg inflating: dataset/train/bald_eagle/32b7f31391336cebcd.jpg inflating: dataset/train/bald_eagle/33c07330420f49d18a.jpg inflating: dataset/train/bald_eagle/3404bd7cca83f82e34.jpg inflating: dataset/train/bald_eagle/3427cb188884503e50.jpg inflating: dataset/train/bald_eagle/344e3ff686b45bde36.jpg inflating: dataset/train/bald_eagle/34a510aca1f3d696e6.jpg inflating: dataset/train/bald_eagle/34c35b173c6fb9f142.jpg inflating: dataset/train/bald_eagle/36a71189699b6ad5c9.jpg inflating: 
dataset/train/bald_eagle/36fbba09e0b1443531.jpg inflating: dataset/train/bald_eagle/3712fe098110d75859.jpg inflating: dataset/train/bald_eagle/37acbcf417e942c52e.jpg inflating: dataset/train/bald_eagle/37bc3515fd61851058.jpg inflating: dataset/train/bald_eagle/3801bf6e6499a81983.jpg inflating: dataset/train/bald_eagle/3878ff0396b025712c.jpg inflating: dataset/train/bald_eagle/389756953b962fc498.jpg inflating: dataset/train/bald_eagle/38e1da89744420a2cc.jpg inflating: dataset/train/bald_eagle/3946b14217afd06e00.jpg inflating: dataset/train/bald_eagle/399e546d54e3bce00c.jpg inflating: dataset/train/bald_eagle/39d75d3e008126559e.JPG inflating: dataset/train/bald_eagle/39ebf154cd371b9e9a.jpg inflating: dataset/train/bald_eagle/3a505816d3bd974a23.JPG inflating: dataset/train/bald_eagle/3ba9cc9573e1e0c475.png inflating: dataset/train/bald_eagle/3bb85b144b69808b51.jpg inflating: dataset/train/bald_eagle/3bc862e3d59b5225fa.jpg inflating: dataset/train/bald_eagle/3bd271963e64c89016.jpg inflating: dataset/train/bald_eagle/3bfd7595c4c040a4b2.jpg inflating: dataset/train/bald_eagle/3cc1a81596a5ec4ed7.jpg inflating: dataset/train/bald_eagle/3da3998fd962ca20d5.jpg inflating: dataset/train/bald_eagle/3dcaa5acc4a1d3c5f4.jpg inflating: dataset/train/bald_eagle/3e4d69128d6a7bcb55.jpg inflating: dataset/train/bald_eagle/3e9a085b80f45e5728.png inflating: dataset/train/bald_eagle/3f0421348d86060910.jpg inflating: dataset/train/bald_eagle/3f96c8676dd7a3065c.jpg inflating: dataset/train/bald_eagle/3fd12e1bf1c32b3a82.jpg inflating: dataset/train/bald_eagle/3fd25faccfe9e92f2a.jpg inflating: dataset/train/bald_eagle/3ffa139b21cb441f34.jpg inflating: dataset/train/bald_eagle/401d0bc171d8bec911.jpg inflating: dataset/train/bald_eagle/4033cfa8c8baa58621.jpg inflating: dataset/train/bald_eagle/408cdf2649cb78e180.jpg inflating: dataset/train/bald_eagle/427985d25d2f328a87.jpg inflating: dataset/train/bald_eagle/431098b01c416c5a2d.JPG inflating: dataset/train/bald_eagle/439fcea6a93e82e5f0.jpg 
inflating: dataset/train/bald_eagle/452ed12e1ce51d4369.jpg inflating: dataset/train/bald_eagle/459389e078360901fe.jpg inflating: dataset/train/bald_eagle/47446680fa88649b9e.jpg inflating: dataset/train/bald_eagle/4796a7a1af468dc44d.jpg inflating: dataset/train/bald_eagle/48514e1a96518734a9.jpg inflating: dataset/train/bald_eagle/4853da5821cab7e7ac.jpg inflating: dataset/train/bald_eagle/489084a00bfa38468c.jpg inflating: dataset/train/bald_eagle/48dde5c1d98644bf1d.jpg inflating: dataset/train/bald_eagle/493795386237b4b627.jpg inflating: dataset/train/bald_eagle/498d859f24508915ae.jpg inflating: dataset/train/bald_eagle/4b1a508e56138b0e09.jpg inflating: dataset/train/bald_eagle/4c1e35fdac382f5499.jpg inflating: dataset/train/bald_eagle/4cfaa3f8d170b9bcfc.JPG inflating: dataset/train/bald_eagle/4dec61a9005127b709.jpg inflating: dataset/train/bald_eagle/4e497950f2ef8dc509.jpg inflating: dataset/train/bald_eagle/4e5f7ecf6564e09723.PNG inflating: dataset/train/bald_eagle/4e8ddcefaa0d593c53.jpg inflating: dataset/train/bald_eagle/4eb73922722ba1f0da.jpg inflating: dataset/train/bald_eagle/4ecd4b874aa5e56bfa.jpg inflating: dataset/train/bald_eagle/4f1752bf010256b988.jpg inflating: dataset/train/bald_eagle/4f178a81f599dc8b09.jpg inflating: dataset/train/bald_eagle/4f69b980ac6a16fb19.jpg inflating: dataset/train/bald_eagle/509775c26d4abc1261.jpg inflating: dataset/train/bald_eagle/50e04c18d091f24a31.jpg inflating: dataset/train/bald_eagle/5154b3ae0b0bcd293e.jpg inflating: dataset/train/bald_eagle/51ed26b5d862cc9c9d.jpg inflating: dataset/train/bald_eagle/5241f3690f17d5ff15.jpg inflating: dataset/train/bald_eagle/52a296a8b5f176bcc5.jpg inflating: dataset/train/bald_eagle/52d2e63f7bd983b0b1.jpg inflating: dataset/train/bald_eagle/52e9f6eff2388b13bb.jpg inflating: dataset/train/bald_eagle/52ee7f603fe97aaa5f.jpg inflating: dataset/train/bald_eagle/534a0be655ce515ecc.jpg inflating: dataset/train/bald_eagle/536b142185cdfd1295.jpg inflating: 
dataset/train/bald_eagle/53e42a242fd2554373.jpg inflating: dataset/train/bald_eagle/540d69d1daeca035cc.jpg inflating: dataset/train/bald_eagle/54ab52ba114da1949f.jpg inflating: dataset/train/bald_eagle/568064c7813ea733a2.jpg inflating: dataset/train/bald_eagle/571bc3ff943e3436cb.jpg inflating: dataset/train/bald_eagle/5742f36ae54bb6cb08.jpg inflating: dataset/train/bald_eagle/5903b28f5a31ef051a.jpg inflating: dataset/train/bald_eagle/5903cada0d28d9a77c.jpg inflating: dataset/train/bald_eagle/592a44af6499df156c.jpg inflating: dataset/train/bald_eagle/594ece12c2e3967325.jpg inflating: dataset/train/bald_eagle/5b472b3efc64846d5e.jpg inflating: dataset/train/bald_eagle/5b84e37d9ee4931a7c.jpg inflating: dataset/train/bald_eagle/5b858c79f0233f4a20.jpg inflating: dataset/train/bald_eagle/5e211581b04733bdff.jpg inflating: dataset/train/bald_eagle/5ecb5a20695a836c11.jpg inflating: dataset/train/bald_eagle/5f3b51e14b91e697fa.jpg inflating: dataset/train/bald_eagle/5f499fb7b2de9e1b08.jpg inflating: dataset/train/bald_eagle/5fb7526dfd65816fe8.jpg inflating: dataset/train/bald_eagle/60a2350b27026a3e8c.jpg inflating: dataset/train/bald_eagle/60f4681d175091ab8a.jpg inflating: dataset/train/bald_eagle/60f8c367513bc1b7ed.jpg inflating: dataset/train/bald_eagle/614594cac8d0e3c4e1.jpg inflating: dataset/train/bald_eagle/618f27447a4da8a6f6.jpg inflating: dataset/train/bald_eagle/61c28be669b28a6888.jpg inflating: dataset/train/bald_eagle/61cfa2226f35c9f1ba.jpg inflating: dataset/train/bald_eagle/61db22ba75ee746b29.jpg inflating: dataset/train/bald_eagle/61f294419c6bdca471.jpg inflating: dataset/train/bald_eagle/627cd5e5f85ebb823c.jpg inflating: dataset/train/bald_eagle/631982f126b6a06ea6.jpg inflating: dataset/train/bald_eagle/63f0a6e6e383c87a51.jpg inflating: dataset/train/bald_eagle/6413c3403b0e71145c.jpg inflating: dataset/train/bald_eagle/646c5eda4d164a336d.jpg inflating: dataset/train/bald_eagle/64a7024f26e8fd30a7.jpg inflating: dataset/train/bald_eagle/64a90a8aa775ae42c8.jpg 
inflating: dataset/train/bald_eagle/6586bd3d62dafa6a9c.jpg inflating: dataset/train/bald_eagle/65c913a25773328369.jpg inflating: dataset/train/bald_eagle/6609396eaf8e42b4ca.jpg inflating: dataset/train/bald_eagle/66e0328906485efd6f.jpg inflating: dataset/train/bald_eagle/66e79c84deb5dfae1c.jpg inflating: dataset/train/bald_eagle/67a93dd3ae4caa8404.jpg inflating: dataset/train/bald_eagle/6804aa8bcc17607aaa.jpg inflating: dataset/train/bald_eagle/68971b2da6113a250f.jpg inflating: dataset/train/bald_eagle/6a125f6103ea74db66.JPG inflating: dataset/train/bald_eagle/6a3249803ddae755ab.jpg inflating: dataset/train/bald_eagle/6a7ddee338e580bd96.jpg inflating: dataset/train/bald_eagle/6b4eb6047c95ff104d.jpg inflating: dataset/train/bald_eagle/6b9b6fa9e5c3c4e803.png inflating: dataset/train/bald_eagle/6ba4e131adc5ad73a8.jpg inflating: dataset/train/bald_eagle/6be437029e906e7b8b.jpg inflating: dataset/train/bald_eagle/6be7f847a07ac01e85.jpg inflating: dataset/train/bald_eagle/6c4554b7e4f79c6297.JPG inflating: dataset/train/bald_eagle/6cb061d0629ea033f8.jpg inflating: dataset/train/bald_eagle/6d6192f8ee3cd0000c.jpg inflating: dataset/train/bald_eagle/6d765e314ac561de9b.jpg inflating: dataset/train/bald_eagle/6d8356e55695ddf3e6.jpg inflating: dataset/train/bald_eagle/6dc98c477fb732df35.jpg inflating: dataset/train/bald_eagle/6df7890c969b4193ad.jpg inflating: dataset/train/bald_eagle/6e0755ce13a88f84a3.jpg inflating: dataset/train/bald_eagle/6e2ed8ea9062a51233.jpg inflating: dataset/train/bald_eagle/6e44049fab00e25826.png inflating: dataset/train/bald_eagle/6ebffc26fbf61c08ff.jpg inflating: dataset/train/bald_eagle/6fccd2fe571d5cadec.jpg inflating: dataset/train/bald_eagle/71031a2c905d28568c.jpg inflating: dataset/train/bald_eagle/718589b6edecc2e4c3.jpg inflating: dataset/train/bald_eagle/72bbd869ecefd57dee.jpg inflating: dataset/train/bald_eagle/742c3e2603fea9c13d.jpg inflating: dataset/train/bald_eagle/74fc463a0a5e2ab9c2.jpg inflating: 
dataset/train/bald_eagle/75c796f3668cd60822.jpg inflating: dataset/train/bald_eagle/76191c7da8b98b7267.jpg inflating: dataset/train/bald_eagle/76c88cde049c1a5acb.jpg inflating: dataset/train/bald_eagle/7799f2e40ca0d6ee42.jpg inflating: dataset/train/bald_eagle/78a7a59ce22eb27c93.jpg inflating: dataset/train/bald_eagle/79415a156a7b33f331.jpg inflating: dataset/train/bald_eagle/79a67150c6529e56f5.jpg inflating: dataset/train/bald_eagle/7a4a9d86a82a012db4.jpg inflating: dataset/train/bald_eagle/7af5269125870db3cf.jpg inflating: dataset/train/bald_eagle/7afbb46d27032dfc55.jpg inflating: dataset/train/bald_eagle/7c3731cbdc9f3df6a1.jpg inflating: dataset/train/bald_eagle/7d64a9cee8e15a4dc5.jpg inflating: dataset/train/bald_eagle/7e98c79e409c0f9b38.jpg inflating: dataset/train/bald_eagle/7f12ca958c3dfbd378.jpg inflating: dataset/train/bald_eagle/7f5dde4a2191dd1480.jpg inflating: dataset/train/bald_eagle/7fa713198a4420df4c.jpg inflating: dataset/train/bald_eagle/7fc7f48bfd10e30d47.jpg inflating: dataset/train/bald_eagle/7ffc1b00942185b58b.jpg inflating: dataset/train/bald_eagle/8059853d4e68e1b93f.jpg inflating: dataset/train/bald_eagle/80f82e14370b98c664.jpg inflating: dataset/train/bald_eagle/810258a8327f366972.jpg inflating: dataset/train/bald_eagle/8102abd6f50302a35e.jpg inflating: dataset/train/bald_eagle/8114eb250cebf3c4f2.jpg inflating: dataset/train/bald_eagle/817801dffca3731340.jpg inflating: dataset/train/bald_eagle/822809e4c210a29938.jpg inflating: dataset/train/bald_eagle/822fb29024f6da1e65.jpg inflating: dataset/train/bald_eagle/8505c3dd4acb286087.jpg inflating: dataset/train/bald_eagle/856e32f5a5f265990b.jpg inflating: dataset/train/bald_eagle/867ef8ce485a84cb22.jpg inflating: dataset/train/bald_eagle/868be0141e0bb06c12.jpg inflating: dataset/train/bald_eagle/86a721bd27327331f4.jpg inflating: dataset/train/bald_eagle/86d15b9e284e9d0a8a.jpg inflating: dataset/train/bald_eagle/8731423f51fe546bcf.jpg inflating: dataset/train/bald_eagle/8735f9cd9cfb793a4c.jpg 
... (unzip output truncated: hundreds of further `inflating:` lines extract the remaining images into dataset/train/bald_eagle/, dataset/train/elk/, and dataset/train/racoon/)
dataset/train/racoon/15721014cd4538adec.jpg inflating: dataset/train/racoon/1693454abe433083cb.jpg inflating: dataset/train/racoon/16a5d7f9dada4e6a97.JPG inflating: dataset/train/racoon/17f58e3fc098c5ced5.jpg inflating: dataset/train/racoon/1874ea06d096ae2ca6.jpg inflating: dataset/train/racoon/18d56bdaa36353ee58.jpg inflating: dataset/train/racoon/19c3b69eb5ae8f61b4.jpg inflating: dataset/train/racoon/1b097743dd6c52ba8b.jpg inflating: dataset/train/racoon/1b167ea9175094f517.JPG inflating: dataset/train/racoon/1b94dd24dd32a2f8dd.jpg inflating: dataset/train/racoon/1da31acf1c44a23f7f.jpg inflating: dataset/train/racoon/2053f2091d632c3db5.jpg inflating: dataset/train/racoon/217c4682133ba77364.jpg inflating: dataset/train/racoon/21a3a9456f4dfd31a2.jpg inflating: dataset/train/racoon/21d953b2990cb8fe18.jpg inflating: dataset/train/racoon/22708c14021a1f7371.jpeg inflating: dataset/train/racoon/23b93be14715cced8a.jpg inflating: dataset/train/racoon/23c8e449ed4b9e98b2.jpg inflating: dataset/train/racoon/23d4174b83aa8d8af5.jpg inflating: dataset/train/racoon/23e6c8ca631dd70b44.jpg inflating: dataset/train/racoon/23f535e81d1c4b8e50.jpg inflating: dataset/train/racoon/246913ad70cdf1a772.jpg inflating: dataset/train/racoon/2524481642b1b9d24a.jpg inflating: dataset/train/racoon/25254b96428d5b85c3.jpg inflating: dataset/train/racoon/260ff885159a116c2e.jpg inflating: dataset/train/racoon/26191a158acaf26806.jpg inflating: dataset/train/racoon/262bfa664fcc60dc54.jpg inflating: dataset/train/racoon/265f2e79c6f92e481a.jpg inflating: dataset/train/racoon/2685b404967fbd31e7.jpg inflating: dataset/train/racoon/26d4d6fc5ab37c5e04.jpg inflating: dataset/train/racoon/290a39ea91901139cd.jpg inflating: dataset/train/racoon/29316476e9dfdc2b90.png inflating: dataset/train/racoon/29f056e104c5ecdcdb.png inflating: dataset/train/racoon/2a3bd9afc719e2051b.jpg inflating: dataset/train/racoon/2ac7ff4c1006571470.jpg inflating: dataset/train/racoon/2b055c2f9c8c15d3d5.jpg inflating: 
dataset/train/racoon/2b1bdb30b8a48f2338.jpg inflating: dataset/train/racoon/2c009b5c34ccb68ce3.jpg inflating: dataset/train/racoon/2c1c235a1710af36f6.jpg inflating: dataset/train/racoon/2c61d0ca47d031d8bc.jpg inflating: dataset/train/racoon/2d078a8e44e728eecc.jpg inflating: dataset/train/racoon/2d2a7d870455d7acd6.jpg inflating: dataset/train/racoon/2d9b0ee6e01d0f9cfe.jpg inflating: dataset/train/racoon/2e67348fe3afeaa175.jpg inflating: dataset/train/racoon/2f934aabd76e98459a.jpg inflating: dataset/train/racoon/2fa1092e0588bfb5b4.jpg inflating: dataset/train/racoon/2fc93309317af593de.jpg inflating: dataset/train/racoon/307eddc455306d0de1.jpg inflating: dataset/train/racoon/319b141502561942ea.jpg inflating: dataset/train/racoon/34b91b379fd2e1a026.jpg inflating: dataset/train/racoon/35c37346524526bdb0.jpg inflating: dataset/train/racoon/36056eab2ba97ccf8a.jpg inflating: dataset/train/racoon/3609de641e1659b6e2.jpg inflating: dataset/train/racoon/381b8d3d5b7f6598fb.png inflating: dataset/train/racoon/383ebd2678c0cfc84d.jpg inflating: dataset/train/racoon/38a870ad4485ce3f6d.jpeg inflating: dataset/train/racoon/395618fc7f4e9fa75d.jpg inflating: dataset/train/racoon/39577bb0003ba18543.jpg inflating: dataset/train/racoon/3a0fdbf41488b404fc.jpg inflating: dataset/train/racoon/3a12aa180e8a10703f.jpg inflating: dataset/train/racoon/3aaaa4eb85ca863a7d.jpg inflating: dataset/train/racoon/3b3804a12b641e3a0d.jpg inflating: dataset/train/racoon/3bfe852d8e292c343f.jpg inflating: dataset/train/racoon/3c184a0d2e3f94535a.JPG inflating: dataset/train/racoon/3d007ed2253a84c966.jpg inflating: dataset/train/racoon/3d0ffc54b0c44ab2bd.jpg inflating: dataset/train/racoon/3d93b05ebdeab7900e.jpg inflating: dataset/train/racoon/3dc6c397fb7e88bb44.jpg inflating: dataset/train/racoon/3dde7d804d6db28e2f.jpg inflating: dataset/train/racoon/3e64c3b3e28405e64b.jpg inflating: dataset/train/racoon/3f2617f7bf7a078e3a.jpg inflating: dataset/train/racoon/3f9d9d7c605f5a9b9c.jpg inflating: 
dataset/train/racoon/400f120535fbbaf61a.jpg inflating: dataset/train/racoon/403453a40250e55af0.jpg inflating: dataset/train/racoon/4069be3809537b0d2a.jpg inflating: dataset/train/racoon/407ba9a7dc0f8b0634.jpg inflating: dataset/train/racoon/4143ab131b01121612.jpg inflating: dataset/train/racoon/420c99aecf830cb51d.jpg inflating: dataset/train/racoon/4225c350ef207df9a0.jpg inflating: dataset/train/racoon/4239c81b7278eaa8f1.jpg inflating: dataset/train/racoon/434a7590f5ab126e35.JPG inflating: dataset/train/racoon/43e51f269008456e04.jpg inflating: dataset/train/racoon/44266b39af34268208.jpg inflating: dataset/train/racoon/44745254086856f8b7.jpg inflating: dataset/train/racoon/44bb26b7b519677712.jpg inflating: dataset/train/racoon/44e36eb5815989d6da.jpg inflating: dataset/train/racoon/450b59337d5f69f917.jpg inflating: dataset/train/racoon/4543461c32dbb7705d.jpg inflating: dataset/train/racoon/4688356c03aba3e35a.jpg inflating: dataset/train/racoon/4698711c315759a9bf.jpg inflating: dataset/train/racoon/4708d45b7163be8acf.jpg inflating: dataset/train/racoon/4897aec65997cf7460.jpg inflating: dataset/train/racoon/48b473628fafcbddd3.jpg inflating: dataset/train/racoon/48f101d22a91a01f77.jpg inflating: dataset/train/racoon/49dac94d873381eac8.jpg inflating: dataset/train/racoon/49fb2e9cede7cda8a0.jpg inflating: dataset/train/racoon/4a4f5163a1538366ed.jpg inflating: dataset/train/racoon/4a9697edf76ff9a832.jpg inflating: dataset/train/racoon/4ad187d3ba57f85660.jpg inflating: dataset/train/racoon/4afc4c399e3eb2e377.jpg inflating: dataset/train/racoon/4bddc043bc5d64cc49.jpg inflating: dataset/train/racoon/4bf37da22424f85cda.jpg inflating: dataset/train/racoon/4c4b8344b6771ddfda.jpg inflating: dataset/train/racoon/4e191705795610874c.jpg inflating: dataset/train/racoon/4f38f9f2a0a1fb788d.jpg inflating: dataset/train/racoon/4f52626c36fe288eb7.jpg inflating: dataset/train/racoon/523e281177cd5e0645.jpg inflating: dataset/train/racoon/528b00ac4f04ebe4e9.jpg inflating: 
dataset/train/racoon/52a65a4148e386bbba.jpg inflating: dataset/train/racoon/53079bb385753c7b74.jpg inflating: dataset/train/racoon/535f13147558cb43ef.jpg inflating: dataset/train/racoon/5380291cf5dee9e5f9.JPG inflating: dataset/train/racoon/53ee80b3f790035b56.jpg inflating: dataset/train/racoon/555617a96090517500.jpg inflating: dataset/train/racoon/55734734d6eb9ef101.jpg inflating: dataset/train/racoon/55a60bbcbde7c70874.JPG inflating: dataset/train/racoon/5614e5aae0e814c0d1.jpg inflating: dataset/train/racoon/568a6cdad5aeb0f10e.jpg inflating: dataset/train/racoon/5785bcd81d1820c5f2.jpg inflating: dataset/train/racoon/57bce484d4daee2cbe.jpg inflating: dataset/train/racoon/57ec5311278660a958.jpg inflating: dataset/train/racoon/58666802bad997537c.jpg inflating: dataset/train/racoon/5880c68a29660909ef.jpg inflating: dataset/train/racoon/589d0dcb45ea8aa62c.jpg inflating: dataset/train/racoon/598bf4975368fd8055.jpg inflating: dataset/train/racoon/59fed8441ae605cf17.jpg inflating: dataset/train/racoon/5aa965a17ac34d3e85.jpg inflating: dataset/train/racoon/5b2a82528a2888e936.jpg inflating: dataset/train/racoon/5b63e3c641b880c77c.jpg inflating: dataset/train/racoon/5bcc356a5d3184bf5e.jpg inflating: dataset/train/racoon/5cb8e51fc602db2b53.jpg inflating: dataset/train/racoon/5ce9647928b40f7244.jpg inflating: dataset/train/racoon/5e02855c21e8f1804c.png inflating: dataset/train/racoon/5ed82b2163f6fdc2ec.jpg inflating: dataset/train/racoon/5f304685feb795ec5c.jpg inflating: dataset/train/racoon/5fab24fc4a8a387f42.png inflating: dataset/train/racoon/5fd5484e6f12c2f6ad.jpg inflating: dataset/train/racoon/60694739be59689c83.jpg inflating: dataset/train/racoon/6100f39924d7d22546.JPG inflating: dataset/train/racoon/62ae7c10c548ac84a0.jpg inflating: dataset/train/racoon/62f80df492b135ad69.jpg inflating: dataset/train/racoon/633b009ea133369dc6.jpg inflating: dataset/train/racoon/63509f416b84c15802.JPG inflating: dataset/train/racoon/635c505b48aca62f92.jpg inflating: 
dataset/train/racoon/6376487d21feb80ae2.jpg inflating: dataset/train/racoon/63e0d8b6296af32db1.JPG inflating: dataset/train/racoon/644d893a90807f0a78.jpeg inflating: dataset/train/racoon/648425a273ccc7f7aa.jpg inflating: dataset/train/racoon/648d0d4d0c54f04e00.jpg inflating: dataset/train/racoon/64e526ba353d0eb514.jpg inflating: dataset/train/racoon/65ddb68c3a3b2304e8.jpg inflating: dataset/train/racoon/664876adcb93f05a9f.jpg inflating: dataset/train/racoon/66bd60c95527381dd5.JPG inflating: dataset/train/racoon/66e0e05b32f5fac8d5.jpg inflating: dataset/train/racoon/66e2841e822744e88c.jpg inflating: dataset/train/racoon/673971d8a48b5eaf91.jpg inflating: dataset/train/racoon/676bc9ec79af2419ef.jpg inflating: dataset/train/racoon/6792e0e652d4ad9629.jpg inflating: dataset/train/racoon/68062ab3f85e611eb2.jpg inflating: dataset/train/racoon/6870e40e793a10e57b.jpg inflating: dataset/train/racoon/68fa6c635a62eda1e7.jpg inflating: dataset/train/racoon/6925ae4453ff2c52e4.jpg inflating: dataset/train/racoon/692d409e5fa10f80a8.jpg inflating: dataset/train/racoon/69a546833cbc679949.jpg inflating: dataset/train/racoon/6a1ef3d7f4d8b98d58.jpg inflating: dataset/train/racoon/6a9d320001c896e6aa.jpeg inflating: dataset/train/racoon/6b064f642c2b422778.jpg inflating: dataset/train/racoon/6cd6b9b15bb724cc9f.jpg inflating: dataset/train/racoon/6df312cafca30480bb.jpg inflating: dataset/train/racoon/6e34efa7a52690342c.jpg inflating: dataset/train/racoon/6e3907280be9395a1f.jpg inflating: dataset/train/racoon/6ef79d51840ac6948a.jpg inflating: dataset/train/racoon/6f630c975cb7aa0695.JPG inflating: dataset/train/racoon/6f7a48a74e71ede114.jpg inflating: dataset/train/racoon/6f8e1431b7df1acf1e.png inflating: dataset/train/racoon/6fc58b0b616f7c4d68.jpg inflating: dataset/train/racoon/702459a79bd1fe0a75.jpg inflating: dataset/train/racoon/7094a0c5040f5b1aab.jpg inflating: dataset/train/racoon/70bced4750aca6b3a1.jpg inflating: dataset/train/racoon/70d93ae03e51722105.jpg inflating: 
dataset/train/racoon/7176f91d22eaffe1f5.jpg inflating: dataset/train/racoon/726883b4b6dd6c437d.png inflating: dataset/train/racoon/726d0cbab270768994.jpg inflating: dataset/train/racoon/72bec7267ed362bc3e.jpg inflating: dataset/train/racoon/72e16d513ad9059b73.jpg inflating: dataset/train/racoon/731e3e935c00e67616.png inflating: dataset/train/racoon/7371fb4d9d0e4a3a2c.jpg inflating: dataset/train/racoon/746e3f19f864b25f81.jpg inflating: dataset/train/racoon/7514c1ad5089f1ebee.jpg inflating: dataset/train/racoon/758758a77c875ef81d.jpg inflating: dataset/train/racoon/78dd24b077e5129f53.jpg inflating: dataset/train/racoon/794916e20aac356662.jpg inflating: dataset/train/racoon/797a85bb3676c5b21d.jpg inflating: dataset/train/racoon/7a6363824a88250b6d.jpg inflating: dataset/train/racoon/7b240c0c13e4198185.gif inflating: dataset/train/racoon/7b47b69925ea183e9c.jpg inflating: dataset/train/racoon/7b582f2081bbc350e3.jpg inflating: dataset/train/racoon/7b828c667d1904bc26.jpg inflating: dataset/train/racoon/7bc852d98f7731b302.jpg inflating: dataset/train/racoon/7cd3e15ad48a3df157.jpg inflating: dataset/train/racoon/7d0cec81b527d39c09.jpg inflating: dataset/train/racoon/7ea96ca0ddbbaa17d5.jpg inflating: dataset/train/racoon/7ee3766bf7138f4545.jpg inflating: dataset/train/racoon/7f7ec5efa93e92f8a2.jpg inflating: dataset/train/racoon/7fc87394421e302d47.jpg inflating: dataset/train/racoon/820f590534b1038b0a.jpg inflating: dataset/train/racoon/825c4bb2f916d79d8c.JPG inflating: dataset/train/racoon/82b2a4dfe9422a70ae.jpg inflating: dataset/train/racoon/83b28605750b2610b3.jpg inflating: dataset/train/racoon/841e3343772ebdf466.jpg inflating: dataset/train/racoon/84b654eca3980debd4.jpg inflating: dataset/train/racoon/84d4a8365cb3ca6966.jpg inflating: dataset/train/racoon/8510b744d3f90cfbc2.jpg inflating: dataset/train/racoon/856ce73a4d39ec67e9.jpg inflating: dataset/train/racoon/863b6d531fef5446cf.jpg inflating: dataset/train/racoon/864f2f6b19a3292ceb.jpg inflating: 
dataset/train/racoon/87bc661cbe14be0d71.jpg inflating: dataset/train/racoon/889f5e75c3257006cb.jpg inflating: dataset/train/racoon/88e522fb3fceec2c77.jpg inflating: dataset/train/racoon/897ccbec3f94497d77.jpg inflating: dataset/train/racoon/8a040869a278247b24.jpg inflating: dataset/train/racoon/8a3a18cc30f4852e75.jpg inflating: dataset/train/racoon/8b136bcd084621307f.jpg inflating: dataset/train/racoon/8b7324a3bbf8e4fa06.jpg inflating: dataset/train/racoon/8b8f0e9639b0e4b370.jpeg inflating: dataset/train/racoon/8bbbe6136263db24ab.jpg inflating: dataset/train/racoon/8ca8f1445ac20797e0.jpeg inflating: dataset/train/racoon/8cb0c1d7f7032eed29.jpg inflating: dataset/train/racoon/8cec09929327e6a7f7.jpg inflating: dataset/train/racoon/8e15d4e9a1b818f51a.jpg inflating: dataset/train/racoon/8e4c4360b49c716ef4.jpg inflating: dataset/train/racoon/8eeef794e86a5ceeef.jpg inflating: dataset/train/racoon/8f923422dfd30480ad.jpg inflating: dataset/train/racoon/8fd14ac205518e3539.JPG inflating: dataset/train/racoon/9178334bcbcddd8b37.jpg inflating: dataset/train/racoon/91959325ace3089ba9.jpg inflating: dataset/train/racoon/91a38c4ed9cef5a6e9.jpg inflating: dataset/train/racoon/91f7a8d35612f80a02.jpg inflating: dataset/train/racoon/924c9540a6e0263ea1.jpg inflating: dataset/train/racoon/93323d8858f1b3842d.png inflating: dataset/train/racoon/9334dba978841fbc7b.jpg inflating: dataset/train/racoon/93fc1576589a191726.JPG inflating: dataset/train/racoon/9481414fdfbf33ab4c.png inflating: dataset/train/racoon/94ca9441cad8400a5b.jpg inflating: dataset/train/racoon/94e7162e8b1caacf57.jpg inflating: dataset/train/racoon/94f4a55c53e0c87d75.jpg inflating: dataset/train/racoon/952fce42f118e62c28.jpg inflating: dataset/train/racoon/953fb6727e4897d5ef.png inflating: dataset/train/racoon/9569b122c912ff6e1b.jpg inflating: dataset/train/racoon/95f5c25558a311e1e9.jpg inflating: dataset/train/racoon/9654d6e0292afb44d7.jpg inflating: dataset/train/racoon/965de3f5adb9de998c.jpg inflating: 
dataset/train/racoon/97b4ea5df2aa219f96.jpg inflating: dataset/train/racoon/9828399b712e3f205f.jpg inflating: dataset/train/racoon/98e6461d08ac9d5f05.png inflating: dataset/train/racoon/9947a3458f96bdd58a.jpg inflating: dataset/train/racoon/9972ab2ea75bcf5e0f.jpg inflating: dataset/train/racoon/99aa4d53f7918b82b1.png inflating: dataset/train/racoon/9a481bf12e6b5a594d.jpg inflating: dataset/train/racoon/9a7d3979fbdfd28219.jpg inflating: dataset/train/racoon/9a87af5ec498026b5c.jpg inflating: dataset/train/racoon/9b0cb1bad842e2abfe.jpg inflating: dataset/train/racoon/9b3a946884d5d0e29d.jpg inflating: dataset/train/racoon/9c71955d15d3499621.jpg inflating: dataset/train/racoon/9c9c0cc3b1228f1527.jpg inflating: dataset/train/racoon/9cd236330a7c7f2f7e.jpg inflating: dataset/train/racoon/9d4ec68e6dc372d97c.jpg inflating: dataset/train/racoon/9dd1b4ef2f91c510bf.jpg inflating: dataset/train/racoon/9de22c616d82c65d3b.png inflating: dataset/train/racoon/9e306849b3e60e0cf9.jpg inflating: dataset/train/racoon/9f1eb31c3ed65585dc.jpg inflating: dataset/train/racoon/9f3674bec1a3ee58e2.jpg inflating: dataset/train/racoon/9f3fae7c7a766d5176.jpg inflating: dataset/train/racoon/9fa4056c84ab229ad1.jpg inflating: dataset/train/racoon/a0652dae629533c580.jpg inflating: dataset/train/racoon/a0a42b86f5efaed171.jpg inflating: dataset/train/racoon/a14c50a58b55c75441.jpg inflating: dataset/train/racoon/a171b777dce13f161f.jpg inflating: dataset/train/racoon/a3d96b8130d0639ea8.jpg inflating: dataset/train/racoon/a438fa85af4599c2c0.jpg inflating: dataset/train/racoon/a53bd1f4b8a9f551a9.jpg inflating: dataset/train/racoon/a5d87589eaa41efee9.jpg inflating: dataset/train/racoon/a5f4781d6684f3f864.jpg inflating: dataset/train/racoon/a6a922c80442369003.jpg inflating: dataset/train/racoon/a6c96ced6bb0e6b180.jpg inflating: dataset/train/racoon/a72283c4583be1f323.jpg inflating: dataset/train/racoon/a7522a2576b933a476.jpg inflating: dataset/train/racoon/a7c9214c8cc44c56c2.jpg inflating: 
dataset/train/racoon/a8b0a042a0be31fac7.jpg inflating: dataset/train/racoon/a9b5b0d72c018e3b6f.jpg inflating: dataset/train/racoon/aa685954007b637048.jpg inflating: dataset/train/racoon/ab42c9329d68c1ee2c.jpg inflating: dataset/train/racoon/ab4a3150e3dfb1e869.jpg inflating: dataset/train/racoon/aba686d7828ecfea01.jpg inflating: dataset/train/racoon/abaeb1a85370bd1764.jpg inflating: dataset/train/racoon/abdc7caf83e92b1d6e.jpg inflating: dataset/train/racoon/ac20534b48e9c2b455.jpg inflating: dataset/train/racoon/ac692cb9c2f14d2f24.jpg inflating: dataset/train/racoon/acae304a624edd087e.jpg inflating: dataset/train/racoon/acd1d110fdb92297f0.jpg inflating: dataset/train/racoon/acf65815f3aa10c86c.jpg inflating: dataset/train/racoon/ad4fb2c6853dd2a4fa.jpg inflating: dataset/train/racoon/adb23a180fd2af7de7.jpg inflating: dataset/train/racoon/adc668f2046bae6254.jpg inflating: dataset/train/racoon/adcdac6246c20c140d.jpg inflating: dataset/train/racoon/ae0746c6f7c69192b9.jpg inflating: dataset/train/racoon/ae75db253491d3a323.jpg inflating: dataset/train/racoon/afacd9d31032669dbe.jpg inflating: dataset/train/racoon/afedd7a07d20a47b71.jpg inflating: dataset/train/racoon/b00a1ff7223bc35dad.jpg inflating: dataset/train/racoon/b07d554783cbab8e47.jpg inflating: dataset/train/racoon/b090a6293bcb45a9de.jpg inflating: dataset/train/racoon/b1105db6f5b9ec448b.png inflating: dataset/train/racoon/b16d3cc14a1acb446f.jpg inflating: dataset/train/racoon/b1bf812f980634ce57.jpg inflating: dataset/train/racoon/b2a2007e6d43c23043.jpg inflating: dataset/train/racoon/b2c6d86193dfb7a838.jpg inflating: dataset/train/racoon/b3c351793a29000a45.jpg inflating: dataset/train/racoon/b437b0dbe2d9f7d93d.jpeg inflating: dataset/train/racoon/b4da1d862fbe6c329f.jpg inflating: dataset/train/racoon/b60320a583a1a01ef5.jpg inflating: dataset/train/racoon/b6b2e30ed9a51c5532.jpg inflating: dataset/train/racoon/b6db2a8e5cdbf66216.jpg inflating: dataset/train/racoon/b744ee96d98cfc4bd9.jpg inflating: 
dataset/train/racoon/b7858416033c1d11fa.jpg inflating: dataset/train/racoon/b85b1fe36e6affea47.jpg inflating: dataset/train/racoon/b923af93f42914c186.jpg inflating: dataset/train/racoon/b9d2cb2356bff39d3c.jpg inflating: dataset/train/racoon/b9dce0178a03b8c8fb.jpg inflating: dataset/train/racoon/b9f8244eaea6bc198b.jpg inflating: dataset/train/racoon/ba33a2f87d575fb84d.jpg inflating: dataset/train/racoon/bae73e6799003a673a.jpg inflating: dataset/train/racoon/bc34a246019a3b0669.jpg inflating: dataset/train/racoon/bc3a049b434edf992c.jpg inflating: dataset/train/racoon/bc5b08f7bc5dd8208d.JPG inflating: dataset/train/racoon/bcd14adc9f925e1793.jpg inflating: dataset/train/racoon/bceb66e80dd767e405.jpg inflating: dataset/train/racoon/bcf0ec2b94d5a4ffcb.jpg inflating: dataset/train/racoon/bdd840747c20eb227c.jpg inflating: dataset/train/racoon/bde3f32687d243b7ee.jpg inflating: dataset/train/racoon/be560eeb92db7c53c5.jpg inflating: dataset/train/racoon/bedb455559bf7897ab.jpg inflating: dataset/train/racoon/bf2a9ba42d9005fc79.jpg inflating: dataset/train/racoon/c0da684515790b6428.jpg inflating: dataset/train/racoon/c1614196c76403e8f7.jpg inflating: dataset/train/racoon/c1f3637d40818457bf.jpg inflating: dataset/train/racoon/c1f8d503f07b36c493.jpg inflating: dataset/train/racoon/c1ff655e446be05a07.jpg inflating: dataset/train/racoon/c231d31401b7f1abd5.jpg inflating: dataset/train/racoon/c23cbe480b1976dd9c.jpg inflating: dataset/train/racoon/c2bdcfcbeb88b0582e.jpg inflating: dataset/train/racoon/c2c5c935dfb7e46334.jpg inflating: dataset/train/racoon/c38bdfe153bc8f6567.jpg inflating: dataset/train/racoon/c3a5e7ba044ede901d.jpg inflating: dataset/train/racoon/c3baa39b453735b5ff.jpg inflating: dataset/train/racoon/c4f628363e9e65a484.jpg inflating: dataset/train/racoon/c55b16a5e5a7acc256.jpg inflating: dataset/train/racoon/c5d15a19a2f670dac7.jpg inflating: dataset/train/racoon/c62c92ba1ddde9b189.jpg inflating: dataset/train/racoon/c63e9d80a0eca4546c.png inflating: 
dataset/train/racoon/c64d496014f5cda74e.jpg inflating: dataset/train/racoon/c823b9e1e4016adb50.jpg inflating: dataset/train/racoon/c88f1aaa95f760c7e8.jpg inflating: dataset/train/racoon/c9ccb025df95a84de4.jpg inflating: dataset/train/racoon/ca296cef08364a4748.jpg inflating: dataset/train/racoon/cad49e3a57a0339a99.jpg inflating: dataset/train/racoon/cbb1cc032dd94921b8.jpg inflating: dataset/train/racoon/cbc0caef3e4180b471.jpg inflating: dataset/train/racoon/cbfa07bc5475639f05.jpg inflating: dataset/train/racoon/cc551735bdfac51e78.jpg inflating: dataset/train/racoon/cc762572558a70cb88.jpg inflating: dataset/train/racoon/ccec3acefd2159409c.jpg inflating: dataset/train/racoon/cd63f8738302a9ea3b.jpg inflating: dataset/train/racoon/cda03c6bac08e144d5.jpeg inflating: dataset/train/racoon/ce3c8c5e84c8d54557.jpg inflating: dataset/train/racoon/cfd90b7628322ba629.jpeg inflating: dataset/train/racoon/cfe1722ab89e88bfee.jpg inflating: dataset/train/racoon/d0c15292fa6ce28409.jpg inflating: dataset/train/racoon/d1cf9192df25e9864e.jpg inflating: dataset/train/racoon/d1d668ccc79b2e0846.jpg inflating: dataset/train/racoon/d2f190fc27fe0034a3.jpg inflating: dataset/train/racoon/d3249f00ca840b0443.jpg inflating: dataset/train/racoon/d37132245010225d0f.jpg inflating: dataset/train/racoon/d3be326edd82fbf27a.jpg inflating: dataset/train/racoon/d401e939204bdf9dc6.jpg inflating: dataset/train/racoon/d42020e5855ba8c57f.jpg inflating: dataset/train/racoon/d422949aa4b761ed57.jpg inflating: dataset/train/racoon/d45317945ac771c6a6.jpg inflating: dataset/train/racoon/d4dc78a8bb137ca01a.jpg inflating: dataset/train/racoon/d51875ab52dc77d231.jpg inflating: dataset/train/racoon/d637b0ea5e68737bac.jpg inflating: dataset/train/racoon/d6a1ecdab637e915e0.jpg inflating: dataset/train/racoon/d70d59022e78369a26.jpg inflating: dataset/train/racoon/d7f4c269c35b04c210.jpg inflating: dataset/train/racoon/d9029f038ab1744552.jpg inflating: dataset/train/racoon/d91b8a66418e0ce6f1.jpg inflating: 
dataset/train/racoon/da9abde08bae0b9aa4.jpg inflating: dataset/train/racoon/db6a13000629856c94.jpg inflating: dataset/train/racoon/dc55afd38a26354c3d.jpg inflating: dataset/train/racoon/dc6b88beba90d74d74.jpg inflating: dataset/train/racoon/dd07e15d907f131cec.jpg inflating: dataset/train/racoon/dd6f0a088365966a4b.jpg inflating: dataset/train/racoon/ddc00b4ce1d6767513.jpg inflating: dataset/train/racoon/dde066f8667ff2c93e.jpg inflating: dataset/train/racoon/de2220a98e60a75c6d.jpg inflating: dataset/train/racoon/de931d154580c44f16.jpg inflating: dataset/train/racoon/df9f88f35bb0effca5.jpg inflating: dataset/train/racoon/e0b42890db8d425ba0.jpg inflating: dataset/train/racoon/e154caa1bea3de28da.jpg inflating: dataset/train/racoon/e199e1463cf0d8245c.jpg inflating: dataset/train/racoon/e19fef4b3f181677ff.jpg inflating: dataset/train/racoon/e1d2bde888b024945c.jpg inflating: dataset/train/racoon/e2841477033f0836f1.jpg inflating: dataset/train/racoon/e2d41f9e11afc0e0f4.gif inflating: dataset/train/racoon/e37a9a0e9fe39e452c.jpg inflating: dataset/train/racoon/e3c5a2f24738c47ec2.jpg inflating: dataset/train/racoon/e406d824e703005f8f.jpg inflating: dataset/train/racoon/e4bebe600167a87dd0.jpg inflating: dataset/train/racoon/e6094568cf8da9e281.jpg inflating: dataset/train/racoon/e621a883e51d53119a.jpg inflating: dataset/train/racoon/e63c4b590715980931.jpg inflating: dataset/train/racoon/e6b05fed2a2d67bf8c.jpg inflating: dataset/train/racoon/e7bfa88b3f8f001e2c.jpg inflating: dataset/train/racoon/e9080045da8f9c4643.jpg inflating: dataset/train/racoon/e91cb32fce4c2abcee.jpg inflating: dataset/train/racoon/e990054cd5edd28086.jpg inflating: dataset/train/racoon/e9b1900532488a13c6.jpg inflating: dataset/train/racoon/e9f7ccaeceb1421f47.jpg inflating: dataset/train/racoon/ea18ec5c7fb5c0baef.jpg inflating: dataset/train/racoon/ea196e288e7ea2b611.jpg inflating: dataset/train/racoon/ead0ec4c000558fee7.jpg inflating: dataset/train/racoon/ebb143340268b40d73.jpg inflating: 
dataset/train/racoon/ebc32f31b0b37f7393.jpg inflating: dataset/train/racoon/ebd6c2a4eafaaa9016.jpg inflating: dataset/train/racoon/ebf15648c1a86b24c1.jpg inflating: dataset/train/racoon/ec31bf026b34761d9e.jpg inflating: dataset/train/racoon/eccabb83234e1d0282.jpg inflating: dataset/train/racoon/ed32cbcd3de2e9e283.jpg inflating: dataset/train/racoon/ede8d249f904413504.jpg inflating: dataset/train/racoon/eefef0cc6421ce03ac.jpg inflating: dataset/train/racoon/f10becdeef97e10b20.jpg inflating: dataset/train/racoon/f266f5051936b7ce85.jpg inflating: dataset/train/racoon/f3383948afaea2ba35.jpg inflating: dataset/train/racoon/f3ad0ab534e5940ae5.jpg inflating: dataset/train/racoon/f40ba9613c3b8c3010.jpg inflating: dataset/train/racoon/f46987803f55122bee.jpg inflating: dataset/train/racoon/f5e36fa8c8eb2eb668.jpg inflating: dataset/train/racoon/f608c229965e6716d6.jpg inflating: dataset/train/racoon/f698e0eaa8efd0e929.png inflating: dataset/train/racoon/f6d01042f7ac64bd3d.jpg inflating: dataset/train/racoon/f6dc56075366164ff9.jpg inflating: dataset/train/racoon/f6fd9bce49e4b5fb1c.jpg inflating: dataset/train/racoon/f7356809c37157402c.jpg inflating: dataset/train/racoon/f7d1137562f0aca1f2.jpg inflating: dataset/train/racoon/f88d46f1768de2ae1a.jpg inflating: dataset/train/racoon/f907f82dba1c5f3469.jpg inflating: dataset/train/racoon/f97c8abc0ef3293a74.jpg inflating: dataset/train/racoon/f97fd5c41821c324d4.jpg inflating: dataset/train/racoon/f9e8d19b34fd39075c.jpg inflating: dataset/train/racoon/fa5f6f45217ee046e5.jpg inflating: dataset/train/racoon/fc5773388be13cb096.jpg inflating: dataset/train/racoon/fc7e157206d2efb73e.jpeg inflating: dataset/train/racoon/fcd38831b658764fe6.png inflating: dataset/train/racoon/fcf6badad016219dcc.jpg inflating: dataset/train/racoon/fe22457811a8fcecdb.jpg inflating: dataset/train/racoon/ff26d4f5760f0e5899.jpg inflating: dataset/train/racoon/ff2cf8e59804a8165c.png inflating: dataset/train/racoon/ff3f90797f846172e1.jpg creating: 
dataset/train/raven/ inflating: dataset/train/raven/005364e6874d98a886.jpg inflating: dataset/train/raven/00e0f529d5094f2116.PNG inflating: dataset/train/raven/018472859e5005d99d.jpg inflating: dataset/train/raven/01e1275646462e2713.jpg inflating: dataset/train/raven/02372725e6f1694eea.jpg inflating: dataset/train/raven/02b2464d61cec045e0.jpg inflating: dataset/train/raven/02d4d944ae416e66aa.png inflating: dataset/train/raven/03d8f95dd47dbce24b.jpg inflating: dataset/train/raven/03dbeb770b3dbe7bbf.jpg inflating: dataset/train/raven/03fb3e563978c1bd13.jpg inflating: dataset/train/raven/03fe372fd738c5be95.jpg inflating: dataset/train/raven/040cccb0299528d42d.jpg inflating: dataset/train/raven/0521ae44dc66c36cbe.jpg inflating: dataset/train/raven/0597fd010e4bd2d332.jpg inflating: dataset/train/raven/05ac21894c548c5de3.jpg inflating: dataset/train/raven/05c41bcbcf47cdfa4b.jpg inflating: dataset/train/raven/05f8e1028c0912887f.jpg inflating: dataset/train/raven/062f32eed3c90022ad.jpg inflating: dataset/train/raven/0636e26ef13128bdf1.jpg inflating: dataset/train/raven/06dcecb0f7c3288380.jpg inflating: dataset/train/raven/0782bfefffe0914473.jpg inflating: dataset/train/raven/0917e589a711c2790b.jpg inflating: dataset/train/raven/0923baa96dd05c63d5.jpg inflating: dataset/train/raven/093a31583e514d7fca.jpg inflating: dataset/train/raven/093dba5fa7a3e56f7a.jpg inflating: dataset/train/raven/0a372cd4d909f75a27.jpg inflating: dataset/train/raven/0a531fd28267a891eb.jpg inflating: dataset/train/raven/0a8daead0dabb910d3.jpg inflating: dataset/train/raven/0b2dff18ffee91fc14.jpg inflating: dataset/train/raven/0fa533de571cfa80c5.jpg inflating: dataset/train/raven/0fa8f63651475abaff.jpg inflating: dataset/train/raven/101fd88a20fffd9cad.jpg inflating: dataset/train/raven/10a1d165acc015011c.jpg inflating: dataset/train/raven/11408db681bb29cc79.jpg inflating: dataset/train/raven/117774c25fc030b416.jpg inflating: dataset/train/raven/120eb0af79290b58b2.jpg inflating: 
data_generator = preprocessing.image.ImageDataGenerator(
    validation_split=0.3,
    rescale=1 / 255.0
)
un_shuffled_train_data = data_generator.flow_from_directory(
    "/content/dataset/train",
    target_size=(100, 100),
    subset='training',
    color_mode="grayscale",
    class_mode='categorical',
    batch_size=32,
    shuffle=False
)
train_data = data_generator.flow_from_directory(
    "/content/dataset/train",
    target_size=(100, 100),
    subset='training',
    color_mode="grayscale",
    class_mode='categorical',
    batch_size=32
)
un_shuffled_validation_data = data_generator.flow_from_directory(
    "/content/dataset/train",
    target_size=(100, 100),
    subset='validation',
    color_mode="grayscale",
    class_mode='categorical',
    batch_size=32,
    shuffle=False
)
validation_data = data_generator.flow_from_directory(
    "/content/dataset/train",
    target_size=(100, 100),
    subset='validation',
    color_mode="grayscale",
    class_mode='categorical',
    batch_size=32
)
Found 1363 images belonging to 4 classes.
Found 1363 images belonging to 4 classes.
Found 582 images belonging to 4 classes.
Found 582 images belonging to 4 classes.
I used 30% of the total training data for the validation set. As we can see, there are 1363 images for training and 582 for validation, with 4 classes in each.
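As a quick sanity check on the 70/30 split (using the image counts reported by the generators above), the validation fraction works out to roughly 30%:

```python
# Counts taken from the flow_from_directory output above
train_count = 1363
val_count = 582
total = train_count + val_count
val_fraction = val_count / total  # should be close to validation_split=0.3
print(total, round(val_fraction, 3))  # → 1945 0.299
```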
validation_data.class_indices
{'bald_eagle': 0, 'elk': 1, 'racoon': 2, 'raven': 3}
train_classes_freq = np.bincount(train_data.classes)
validation_classes_freq = np.bincount(validation_data.classes)
labels = ["Bald Eagle", "Elk", "Raccoon", "Raven"]
y_pos = np.arange(len(labels))
width = 0.2
fig = plt.figure()
fig.set_figheight(6)
fig.set_figwidth(10)
plt.bar(y_pos - width/2, train_classes_freq, width, label='Train')
plt.bar(y_pos + width/2, validation_classes_freq, width, label='Validation')
plt.xticks(y_pos, labels)
plt.title("Frequency of Classes in Dataset")
plt.ylabel("Frequency")
plt.xlabel("Class")
plt.legend()
fig.tight_layout()
plt.show()
img = preprocessing.image.load_img("/content/dataset/train/bald_eagle/d7a59b6e4b8307a9ec.jpg", target_size=(100, 100))
imgplot = plt.imshow(img)
plt.title("Bald Eagle")
plt.show()
img = preprocessing.image.load_img("/content/dataset/train/elk/71d43626a126a4c8c0.jpg", target_size=(100, 100))
imgplot = plt.imshow(img)
plt.title("Elk")
plt.show()
img = preprocessing.image.load_img("/content/dataset/train/racoon/5ed82b2163f6fdc2ec.jpg", target_size=(100, 100))
imgplot = plt.imshow(img)
plt.title("Raccoon")
plt.show()
img = preprocessing.image.load_img("/content/dataset/train/raven/bc1f4aaaf892fbf95b.jpg", target_size=(100, 100))
imgplot = plt.imshow(img)
plt.title("Raven")
plt.show()
One-hot encoding: For categorical variables where no ordinal relationship exists, one-hot encoding is better than integer encoding; otherwise the model can perform poorly. This type of encoding creates a new binary feature for each possible category and assigns a value of 1 to the feature that corresponds to the sample's original category. One-hot encoding has the advantage that the result is binary rather than ordinal and that everything sits in an orthogonal vector space. Since there is no quantitative relationship between the individual values of nominal variables, using ordinal encoding can create a fictional ordinal relationship in the data [8]. Therefore, one-hot encoding is often applied to nominal variables in order to improve the performance of the algorithm. In this project, since the classes are not ordinal and their number is small, one-hot encoding is a good choice. It also helps prevent the model from becoming biased: an integer encoding would impose a ranking between classes, but there is no rank between animals, so one-hot encoding is the better method.
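A minimal sketch of what one-hot encoding does for our 4 classes (the indices follow `class_indices` above; this mirrors what `class_mode='categorical'` produces, here done by hand with NumPy):

```python
import numpy as np

# Class indices: bald_eagle=0, elk=1, racoon=2, raven=3
labels = np.array([0, 2, 3, 1])  # four example samples by integer label
one_hot = np.eye(4)[labels]      # each row is a binary indicator vector
print(one_hot)
# Row 0 is [1, 0, 0, 0] (bald_eagle), row 1 is [0, 0, 1, 0] (racoon), etc.
```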
input = keras.layers.Input(shape=(100,100,1))
output = keras.layers.Flatten()(input)
output = keras.layers.Dense(2048, activation='relu')(output)
output = keras.layers.Dense(1024, activation='relu')(output)
output = keras.layers.Dense(4, activation='softmax')(output)
model = keras.models.Model(inputs=input, outputs=output)
model.compile(
    optimizer=keras.optimizers.SGD(learning_rate=0.01),
    loss='categorical_crossentropy',
    metrics=['accuracy']
)
model.summary()
history_relu = model.fit(train_data, validation_data=validation_data, epochs=10)
Model: "model_34"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_35 (InputLayer) [(None, 100, 100, 1)] 0
flatten_34 (Flatten) (None, 10000) 0
dense_103 (Dense) (None, 2048) 20482048
dense_104 (Dense) (None, 1024) 2098176
dense_105 (Dense) (None, 4) 4100
=================================================================
Total params: 22,584,324
Trainable params: 22,584,324
Non-trainable params: 0
_________________________________________________________________
/usr/local/lib/python3.7/dist-packages/PIL/Image.py:960: UserWarning: Palette images with Transparency expressed in bytes should be converted to RGBA images "Palette images with Transparency expressed in bytes should be "
Epoch 1/10: 49/49 [==============================] - 57s 1s/step - loss: 1.4738 - accuracy: 0.3687 - val_loss: 1.1629 - val_accuracy: 0.5180
Epoch 2/10: 49/49 [==============================] - 55s 1s/step - loss: 1.1825 - accuracy: 0.4759 - val_loss: 1.2476 - val_accuracy: 0.4175
Epoch 3/10: 49/49 [==============================] - 56s 1s/step - loss: 1.0932 - accuracy: 0.5299 - val_loss: 1.1289 - val_accuracy: 0.5361
Epoch 4/10: 49/49 [==============================] - 55s 1s/step - loss: 1.0469 - accuracy: 0.5684 - val_loss: 1.1778 - val_accuracy: 0.3995
Epoch 5/10: 49/49 [==============================] - 55s 1s/step - loss: 1.0180 - accuracy: 0.5845 - val_loss: 1.0175 - val_accuracy: 0.5464
Epoch 6/10: 49/49 [==============================] - 55s 1s/step - loss: 0.9764 - accuracy: 0.6005 - val_loss: 1.1856 - val_accuracy: 0.4485
Epoch 7/10: 49/49 [==============================] - 55s 1s/step - loss: 0.9149 - accuracy: 0.6288 - val_loss: 1.0739 - val_accuracy: 0.5464
Epoch 8/10: 49/49 [==============================] - 55s 1s/step - loss: 0.9031 - accuracy: 0.6577 - val_loss: 0.9254 - val_accuracy: 0.6495
Epoch 9/10: 49/49 [==============================] - 55s 1s/step - loss: 0.8655 - accuracy: 0.6551 - val_loss: 0.8906 - val_accuracy: 0.6418
Epoch 10/10: 49/49 [==============================] - 55s 1s/step - loss: 0.8424 - accuracy: 0.6802 - val_loss: 1.0554 - val_accuracy: 0.5464
Based on the summary, the 20,482,048 parameters of the first dense layer come from:
100 * 100 * 2048 (weights) + 2048 (biases) = 20482048
the 2,098,176 parameters of the second come from:
2048 * 1024 + 1024 = 2098176
and the 4,100 parameters of the output layer come from:
1024 * 4 + 4 = 4100
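The arithmetic above can be packaged into a small helper that reproduces every count in `model.summary()` (the input is flattened to 100 * 100 = 10000 features):

```python
def dense_params(n_in, n_out):
    # A fully connected layer has n_in * n_out weights plus one bias per output unit.
    return n_in * n_out + n_out

print(dense_params(100 * 100, 2048))  # → 20482048
print(dense_params(2048, 1024))       # → 2098176
print(dense_params(1024, 4))          # → 4100
```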
def print_results(modell, train_generator, validation_generator):
    print("----->TRAIN")
    pred = np.argmax(modell.predict(train_generator, batch_size=32), axis=1)
    print(classification_report(train_generator.labels, pred))
    print("---->Validation")
    pred = np.argmax(modell.predict(validation_generator, batch_size=32), axis=1)
    print(classification_report(validation_generator.labels, pred))
print_results(model, un_shuffled_train_data, un_shuffled_validation_data)
def plot_results(hist):
    fig = plt.figure()
    fig.set_figheight(5)
    fig.set_figwidth(8)
    plt.plot(hist.history["accuracy"], label="Train")
    plt.plot(hist.history["val_accuracy"], label="Validation")
    plt.title("Accuracy of Train and Validation Data")
    plt.xlabel("Epoch")
    plt.ylabel("Accuracy")
    plt.legend()
    plt.show()

    fig = plt.figure()
    fig.set_figheight(5)
    fig.set_figwidth(8)
    plt.plot(hist.history["loss"], label="Train")
    plt.plot(hist.history["val_loss"], label="Validation")
    plt.title("Loss of Train and Validation Data")
    plt.xlabel("Epoch")
    plt.ylabel("Loss")
    plt.legend()
    plt.show()
plot_results(history_relu)
In the context of SGD, instead of computing the exact gradient of the loss function over the full dataset, we approximate it on small batches in an iterative fashion. Hence, an individual update is not guaranteed to move in the direction that minimizes the loss. To make learning more stable, direction-aware, and fast, SGD with momentum determines the next update as a linear combination of the current gradient and the previous update, so it also takes earlier updates into account. In general, momentum SGD provides two advantages over the classical version:
The momentum coefficient takes values in [0, 1]. It is an exponential decay factor that determines the relative contribution of the current gradient and earlier gradients to the weight change [1]. A momentum of 0.0 is the same as gradient descent without momentum; with momentum = 1, the optimization process takes the full history of previous updates into account.
Momentum is designed to accelerate learning, especially in the face of high curvature, small but consistent gradients, or noisy gradients. It also helps the optimizer avoid getting stuck in local minima.
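The update rule described above can be sketched by hand (a simplified scalar version of the classical momentum update, not the Keras internals; the velocity variable accumulates the exponentially decayed history of past updates):

```python
def momentum_step(w, grad, velocity, lr=0.01, momentum=0.9):
    # Blend the previous update (scaled by the momentum coefficient)
    # with the step suggested by the current gradient.
    velocity = momentum * velocity - lr * grad
    return w + velocity, velocity

w, v = 1.0, 0.0
for _ in range(3):  # pretend the gradient is a constant 2.0 for three steps
    w, v = momentum_step(w, 2.0, v)
print(round(w, 4), round(v, 4))  # → 0.8878 -0.0542 (steps grow as velocity builds up)
```

With momentum=0.0 each step would be a fixed -0.02; here the third step is already -0.0542, which is the "acceleration under small but consistent gradients" the text describes.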
Momentum = 0.5
input = keras.layers.Input(shape=(100,100,1))
output = keras.layers.Flatten()(input)
output = keras.layers.Dense(4096, activation='relu')(output)
output = keras.layers.Dense(2048, activation='relu')(output)
output = keras.layers.Dense(4, activation='softmax')(output)
model = keras.models.Model(inputs=input, outputs=output)
model.compile(
    optimizer=keras.optimizers.SGD(learning_rate=0.01, momentum=0.5),
    loss='categorical_crossentropy',
    metrics=['accuracy']
)
model.summary()
history_relu_momentum05 = model.fit(train_data, validation_data=validation_data, epochs=10)
Model: "model_26"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_27 (InputLayer) [(None, 128, 128, 1)] 0
flatten_26 (Flatten) (None, 16384) 0
dense_79 (Dense) (None, 4096) 67112960
dense_80 (Dense) (None, 2048) 8390656
dense_81 (Dense) (None, 4) 8196
=================================================================
Total params: 75,511,812
Trainable params: 75,511,812
Non-trainable params: 0
_________________________________________________________________
Epoch 1/10
49/49 [==============================] - 68s 1s/step - loss: 1.5427 - accuracy: 0.3802 - val_loss: 1.1382 - val_accuracy: 0.5619
Epoch 2/10: 49/49 [==============================] - 66s 1s/step - loss: 1.1629 - accuracy: 0.5138 - val_loss: 1.1808 - val_accuracy: 0.5000
Epoch 3/10: 49/49 [==============================] - 66s 1s/step - loss: 1.0898 - accuracy: 0.5234 - val_loss: 1.0402 - val_accuracy: 0.5515
Epoch 4/10: 49/49 [==============================] - 66s 1s/step - loss: 1.0269 - accuracy: 0.5761 - val_loss: 1.0164 - val_accuracy: 0.5619
Epoch 5/10: 49/49 [==============================] - 67s 1s/step - loss: 0.9446 - accuracy: 0.6134 - val_loss: 0.9830 - val_accuracy: 0.6031
Epoch 6/10: 49/49 [==============================] - 67s 1s/step - loss: 0.8934 - accuracy: 0.6493 - val_loss: 1.0178 - val_accuracy: 0.5541
Epoch 7/10: 49/49 [==============================] - 67s 1s/step - loss: 0.8256 - accuracy: 0.6737 - val_loss: 0.9359 - val_accuracy: 0.6340
Epoch 8/10: 49/49 [==============================] - 67s 1s/step - loss: 0.7747 - accuracy: 0.7026 - val_loss: 0.9158 - val_accuracy: 0.6546
Epoch 9/10: 49/49 [==============================] - 70s 1s/step - loss: 0.7443 - accuracy: 0.7174 - val_loss: 0.8178 - val_accuracy: 0.6985
Epoch 10/10: 49/49 [==============================] - 66s 1s/step - loss: 0.7406 - accuracy: 0.7290 - val_loss: 1.0442 - val_accuracy: 0.5851
print_results(model,un_shuffled_train_data, un_shuffled_validation_data)
plot_results(history_relu_momentum05)
----->TRAIN
precision recall f1-score support
0 0.55 0.59 0.57 417
1 0.61 0.86 0.71 370
2 0.73 0.76 0.74 406
3 0.81 0.37 0.51 364
accuracy 0.65 1557
macro avg 0.67 0.64 0.63 1557
weighted avg 0.67 0.65 0.63 1557
---->Validation
precision recall f1-score support
0 0.52 0.60 0.56 104
1 0.57 0.79 0.66 92
2 0.62 0.64 0.63 101
3 0.84 0.34 0.48 91
accuracy 0.60 388
macro avg 0.64 0.59 0.58 388
weighted avg 0.63 0.60 0.59 388
Momentum = 0.9
input = keras.layers.Input(shape=(100,100,1))
output = keras.layers.Flatten()(input)
output = keras.layers.Dense(4096, activation='relu')(output)
output = keras.layers.Dense(2048, activation='relu')(output)
output = keras.layers.Dense(4, activation='softmax')(output)
model = keras.models.Model(inputs=input, outputs=output)
model.compile(
    optimizer=keras.optimizers.SGD(learning_rate=0.01, momentum=0.9),
    loss='categorical_crossentropy',
    metrics=['accuracy']
)
model.summary()
history_relu_momentum09 = model.fit(train_data, validation_data=validation_data , epochs=10)
Model: "model_27"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_28 (InputLayer) [(None, 128, 128, 1)] 0
flatten_27 (Flatten) (None, 16384) 0
dense_82 (Dense) (None, 4096) 67112960
dense_83 (Dense) (None, 2048) 8390656
dense_84 (Dense) (None, 4) 8196
=================================================================
Total params: 75,511,812
Trainable params: 75,511,812
Non-trainable params: 0
_________________________________________________________________
Epoch 1/10
9/49 [====>.........................] - ETA: 40s - loss: 3.8119 - accuracy: 0.2361
49/49 [==============================] - 68s 1s/step - loss: 1.9823 - accuracy: 0.3224 - val_loss: 1.3425 - val_accuracy: 0.3557
Epoch 2/10
49/49 [==============================] - 67s 1s/step - loss: 1.2700 - accuracy: 0.4027 - val_loss: 1.1830 - val_accuracy: 0.4485
Epoch 3/10
49/49 [==============================] - 67s 1s/step - loss: 1.1961 - accuracy: 0.4399 - val_loss: 1.1616 - val_accuracy: 0.4923
Epoch 4/10
49/49 [==============================] - 67s 1s/step - loss: 1.1419 - accuracy: 0.4618 - val_loss: 1.0936 - val_accuracy: 0.5103
Epoch 5/10
49/49 [==============================] - 67s 1s/step - loss: 1.1143 - accuracy: 0.5100 - val_loss: 1.0835 - val_accuracy: 0.5722
Epoch 6/10
49/49 [==============================] - 67s 1s/step - loss: 1.1042 - accuracy: 0.4952 - val_loss: 1.0842 - val_accuracy: 0.5515
Epoch 7/10
49/49 [==============================] - 66s 1s/step - loss: 1.0882 - accuracy: 0.5138 - val_loss: 1.0016 - val_accuracy: 0.5954
Epoch 8/10
49/49 [==============================] - 66s 1s/step - loss: 1.0038 - accuracy: 0.5601 - val_loss: 1.0973 - val_accuracy: 0.5284
Epoch 9/10
49/49 [==============================] - 66s 1s/step - loss: 1.0391 - accuracy: 0.5665 - val_loss: 1.0598 - val_accuracy: 0.6186
Epoch 10/10
49/49 [==============================] - 66s 1s/step - loss: 1.0147 - accuracy: 0.5742 - val_loss: 1.0147 - val_accuracy: 0.5954
print_results(model,un_shuffled_train_data, un_shuffled_validation_data )
plot_results(history_relu_momentum09)
----->TRAIN
precision recall f1-score support
0 0.55 0.59 0.57 417
1 0.61 0.86 0.71 370
2 0.73 0.76 0.74 406
3 0.81 0.37 0.51 364
accuracy 0.65 1557
macro avg 0.67 0.64 0.63 1557
weighted avg 0.67 0.65 0.63 1557
---->Validation
precision recall f1-score support
0 0.52 0.60 0.56 104
1 0.57 0.79 0.66 92
2 0.62 0.64 0.63 101
3 0.84 0.34 0.48 91
accuracy 0.60 388
macro avg 0.64 0.59 0.58 388
weighted avg 0.63 0.60 0.59 388
Question) Is it always good to set momentum to a high value?
No. Setting momentum up to about 0.9 is generally safe, but with a very high momentum such as 0.95 combined with a high learning rate, the effective steps become too large: the weights overshoot and never settle at their optimal values, so the network is not updated correctly. Therefore, a large momentum does not necessarily lead to higher accuracy.
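The overshooting effect can be seen in a minimal sketch (plain Python; the learning rate, momentum values, and toy objective f(w) = w² are made-up illustrations, not taken from this notebook) of the momentum update v ← m·v − lr·∇f(w), w ← w + v:

```python
# Heavy-ball / momentum SGD on f(w) = w**2, whose gradient is 2*w.
# Both runs use the same learning rate; only the momentum differs.
def sgd_momentum(w0, lr, momentum, steps=50):
    w, v = w0, 0.0
    for _ in range(steps):
        grad = 2.0 * w                 # gradient of f(w) = w**2
        v = momentum * v - lr * grad   # velocity update
        w = w + v                      # weight update
    return w, v

w_ok, v_ok = sgd_momentum(1.0, lr=0.1, momentum=0.5)    # settles near w = 0
w_hi, v_hi = sgd_momentum(1.0, lr=0.1, momentum=0.99)   # still oscillating
```

With momentum 0.5 the iterate contracts quickly toward the minimum at w = 0, while with momentum 0.99 the accumulated velocity keeps overshooting, so after the same number of steps it is still far from converged.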
Adam
input = keras.layers.Input(shape=(100,100,1))
output = keras.layers.Flatten()(input)
output = keras.layers.Dense(4096, activation='relu')(output)
output = keras.layers.Dense(2048, activation='relu')(output)
output = keras.layers.Dense(4, activation='softmax')(output)
model = keras.models.Model(inputs=input, outputs=output)
model.compile(
    optimizer=keras.optimizers.Adam(learning_rate=0.001),
    loss='categorical_crossentropy',
    metrics=['accuracy']
)
model.summary()
history_relu_adam = model.fit(train_data, validation_data=validation_data , epochs=10)
Model: "model_28"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_29 (InputLayer) [(None, 128, 128, 1)] 0
flatten_28 (Flatten) (None, 16384) 0
dense_85 (Dense) (None, 4096) 67112960
dense_86 (Dense) (None, 2048) 8390656
dense_87 (Dense) (None, 4) 8196
=================================================================
Total params: 75,511,812
Trainable params: 75,511,812
Non-trainable params: 0
_________________________________________________________________
Epoch 1/10
11/49 [=====>........................] - ETA: 55s - loss: 38.6875 - accuracy: 0.2557
49/49 [==============================] - 75s 2s/step - loss: 10.6317 - accuracy: 0.3192 - val_loss: 1.2812 - val_accuracy: 0.4330
Epoch 2/10
49/49 [==============================] - 69s 1s/step - loss: 1.2311 - accuracy: 0.4252 - val_loss: 1.1583 - val_accuracy: 0.5077
Epoch 3/10
49/49 [==============================] - 69s 1s/step - loss: 1.1995 - accuracy: 0.4566 - val_loss: 1.2555 - val_accuracy: 0.4562
Epoch 4/10
49/49 [==============================] - 69s 1s/step - loss: 1.1905 - accuracy: 0.4457 - val_loss: 1.2091 - val_accuracy: 0.4691
Epoch 5/10
49/49 [==============================] - 69s 1s/step - loss: 1.1220 - accuracy: 0.4881 - val_loss: 1.0681 - val_accuracy: 0.5000
Epoch 6/10
49/49 [==============================] - 69s 1s/step - loss: 1.0773 - accuracy: 0.5286 - val_loss: 1.0577 - val_accuracy: 0.5515
Epoch 7/10
49/49 [==============================] - 69s 1s/step - loss: 1.0570 - accuracy: 0.5594 - val_loss: 1.0809 - val_accuracy: 0.4923
Epoch 8/10
49/49 [==============================] - 69s 1s/step - loss: 1.0505 - accuracy: 0.5491 - val_loss: 1.1716 - val_accuracy: 0.4536
Epoch 9/10
49/49 [==============================] - 70s 1s/step - loss: 0.9647 - accuracy: 0.5896 - val_loss: 1.0277 - val_accuracy: 0.5515
Epoch 10/10
49/49 [==============================] - 69s 1s/step - loss: 0.9649 - accuracy: 0.5896 - val_loss: 0.9866 - val_accuracy: 0.5696
print_results(model,un_shuffled_train_data, un_shuffled_validation_data )
plot_results(history_relu_adam)
----->TRAIN
precision recall f1-score support
0 0.89 0.30 0.45 417
1 0.53 0.92 0.67 370
2 0.81 0.66 0.73 406
3 0.61 0.74 0.67 364
accuracy 0.64 1557
macro avg 0.71 0.66 0.63 1557
weighted avg 0.72 0.64 0.63 1557
---->Validation
precision recall f1-score support
0 0.76 0.28 0.41 104
1 0.45 0.85 0.59 92
2 0.69 0.50 0.58 101
3 0.61 0.69 0.65 91
accuracy 0.57 388
macro avg 0.63 0.58 0.56 388
weighted avg 0.63 0.57 0.55 388
The results are roughly the same as with SGD.
input = keras.layers.Input(shape=(100,100,1))
output = keras.layers.Flatten()(input)
output = keras.layers.Dense(4096, activation='relu')(output)
output = keras.layers.Dense(2048, activation='relu')(output)
output = keras.layers.Dense(4, activation='softmax')(output)
model = keras.models.Model(inputs=input, outputs=output)
model.compile(
    optimizer=keras.optimizers.Adam(learning_rate=0.001),
    loss='categorical_crossentropy',
    metrics=['accuracy']
)
model.summary()
history_relu_adam_20_epoch = model.fit(train_data, validation_data=validation_data , epochs=20)
Model: "model_41"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_42 (InputLayer) [(None, 100, 100, 1)] 0
flatten_41 (Flatten) (None, 10000) 0
dense_124 (Dense) (None, 2048) 20482048
dense_125 (Dense) (None, 1024) 2098176
dense_126 (Dense) (None, 4) 4100
=================================================================
Total params: 22,584,324
Trainable params: 22,584,324
Non-trainable params: 0
_________________________________________________________________
Epoch 1/20
8/43 [====>.........................] - ETA: 25s - loss: 16.4025 - accuracy: 0.2617
43/43 [==============================] - 57s 1s/step - loss: 4.8584 - accuracy: 0.3199 - val_loss: 1.4315 - val_accuracy: 0.3093
Epoch 2/20
43/43 [==============================] - 56s 1s/step - loss: 1.3336 - accuracy: 0.3910 - val_loss: 1.1473 - val_accuracy: 0.4914
Epoch 3/20
43/43 [==============================] - 56s 1s/step - loss: 1.1729 - accuracy: 0.4864 - val_loss: 1.1883 - val_accuracy: 0.4381
Epoch 4/20
43/43 [==============================] - 56s 1s/step - loss: 1.1074 - accuracy: 0.5128 - val_loss: 1.1226 - val_accuracy: 0.4742
Epoch 5/20
43/43 [==============================] - 57s 1s/step - loss: 1.0963 - accuracy: 0.5187 - val_loss: 1.0793 - val_accuracy: 0.5137
Epoch 6/20
43/43 [==============================] - 57s 1s/step - loss: 1.0310 - accuracy: 0.5715 - val_loss: 1.0942 - val_accuracy: 0.5223
Epoch 7/20
43/43 [==============================] - 57s 1s/step - loss: 1.0509 - accuracy: 0.5378 - val_loss: 1.0264 - val_accuracy: 0.5258
Epoch 8/20
43/43 [==============================] - 57s 1s/step - loss: 0.9775 - accuracy: 0.5899 - val_loss: 1.0688 - val_accuracy: 0.5447
Epoch 9/20
43/43 [==============================] - 56s 1s/step - loss: 0.9366 - accuracy: 0.6097 - val_loss: 1.1517 - val_accuracy: 0.5069
Epoch 10/20
43/43 [==============================] - 57s 1s/step - loss: 0.9665 - accuracy: 0.5994 - val_loss: 1.0327 - val_accuracy: 0.5550
Epoch 11/20
43/43 [==============================] - 57s 1s/step - loss: 0.9566 - accuracy: 0.5913 - val_loss: 1.0880 - val_accuracy: 0.4897
Epoch 12/20
43/43 [==============================] - 57s 1s/step - loss: 0.8938 - accuracy: 0.6302 - val_loss: 0.9408 - val_accuracy: 0.6134
Epoch 13/20
43/43 [==============================] - 56s 1s/step - loss: 0.8981 - accuracy: 0.6185 - val_loss: 0.9396 - val_accuracy: 0.6048
Epoch 14/20
43/43 [==============================] - 56s 1s/step - loss: 0.7709 - accuracy: 0.6742 - val_loss: 0.9367 - val_accuracy: 0.6340
Epoch 15/20
43/43 [==============================] - 57s 1s/step - loss: 0.8399 - accuracy: 0.6508 - val_loss: 1.0593 - val_accuracy: 0.5550
Epoch 16/20
43/43 [==============================] - 59s 1s/step - loss: 0.8231 - accuracy: 0.6537 - val_loss: 1.2320 - val_accuracy: 0.5464
Epoch 17/20
43/43 [==============================] - 57s 1s/step - loss: 0.7389 - accuracy: 0.7131 - val_loss: 0.9851 - val_accuracy: 0.6100
Epoch 18/20
43/43 [==============================] - 57s 1s/step - loss: 0.6465 - accuracy: 0.7520 - val_loss: 0.8805 - val_accuracy: 0.6460
Epoch 19/20
43/43 [==============================] - 56s 1s/step - loss: 0.5828 - accuracy: 0.7667 - val_loss: 1.0385 - val_accuracy: 0.6082
Epoch 20/20
43/43 [==============================] - 56s 1s/step - loss: 0.5299 - accuracy: 0.7960 - val_loss: 0.9436 - val_accuracy: 0.6701
print_results(model,un_shuffled_train_data, un_shuffled_validation_data)
plot_results(history_relu_adam_20_epoch)
----->TRAIN
precision recall f1-score support
0 0.79 0.80 0.80 365
1 0.89 0.93 0.91 324
2 0.83 0.93 0.88 355
3 0.92 0.74 0.82 319
accuracy 0.85 1363
macro avg 0.86 0.85 0.85 1363
weighted avg 0.86 0.85 0.85 1363
---->Validation
precision recall f1-score support
0 0.60 0.60 0.60 156
1 0.69 0.76 0.72 138
2 0.63 0.72 0.67 152
3 0.83 0.60 0.70 136
accuracy 0.67 582
macro avg 0.69 0.67 0.67 582
weighted avg 0.68 0.67 0.67 582
Question) Why do we use multiple epochs? Is it necessary to train our model over several epochs?
An epoch is one complete pass of the model over the entire training dataset. Training for just one epoch can cause underfitting: a single pass may not give the learning algorithm enough updates to fit the data well. Hence we repeat the training process over multiple epochs so the model keeps improving.
It is not necessary for every dataset, though. If we have enough data to train the model well, we may reach a good result in a single pass; in real life, however, we rarely have that much data.
Question) By which criteria can we detect overfitting?
One criterion is accuracy: when training accuracy keeps rising while accuracy on the test (or validation) set drops lower and lower, overfitting is happening. The other criterion is the loss value: when, after some epochs, the loss on the training set keeps decreasing while the loss on the test set keeps increasing, overfitting has occurred.
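This check can be automated. Below is a minimal sketch (plain Python; the helper name `first_overfit_epoch`, the `patience` threshold, and the history values are made up for illustration) that scans a Keras-style `history.history` dict and reports the epoch after which validation loss stopped improving while training loss kept falling:

```python
def first_overfit_epoch(history, patience=3):
    """Return the 1-indexed epoch with the best validation loss if the model
    then went on to overfit (val_loss stalls while train loss keeps falling),
    otherwise None."""
    losses, val_losses = history["loss"], history["val_loss"]
    best = min(range(len(val_losses)), key=val_losses.__getitem__)
    stalled = len(val_losses) - 1 - best >= patience   # val_loss not improving
    still_learning = losses[-1] < losses[best]         # train loss still falling
    return best + 1 if stalled and still_learning else None

# Made-up curves: validation loss bottoms out at epoch 3, then climbs
# while the training loss keeps decreasing -- the overfitting signature.
fake = {"loss":     [1.20, 0.90, 0.70, 0.55, 0.40, 0.30, 0.22],
        "val_loss": [1.10, 0.95, 0.90, 0.92, 0.97, 1.05, 1.10]}
```

Here `first_overfit_epoch(fake)` returns 3, the epoch whose weights an early-stopping callback with `restore_best_weights=True` would keep.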
Question) Is it always useful to employ a lot of epochs to train the network?
No; employing too many epochs may cause overfitting. This is because the model learns the training data, including its noise, so well that it stops generalizing: performance on the training data stays good while performance on the test data degrades. One way to avoid overfitting is early stopping, i.e., halting training when the model starts to overfit, which we can detect by the criteria mentioned above.
from tensorflow.keras.callbacks import EarlyStopping
early_stopping = EarlyStopping(patience=8)
input = keras.layers.Input(shape=(100,100,1))
output = keras.layers.Flatten()(input)
output = keras.layers.Dense(4096, activation='relu')(output)
output = keras.layers.Dense(2048, activation='relu')(output)
output = keras.layers.Dense(4, activation='softmax')(output)
model = keras.models.Model(inputs=input, outputs=output)
model.compile(
    optimizer=keras.optimizers.Adam(learning_rate=0.001),
    loss='categorical_crossentropy',
    metrics=['accuracy']
)
model.summary()
history_relu_adam_20_epoch_early_stopping = model.fit(un_shuffled_train_data, validation_data=un_shuffled_validation_data , epochs=20, callbacks=[early_stopping])
Model: "model_14"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_13 (InputLayer) [(None, 100, 100, 1)] 0
flatten_11 (Flatten) (None, 10000) 0
dense_35 (Dense) (None, 4096) 40964096
dense_36 (Dense) (None, 2048) 8390656
dense_37 (Dense) (None, 4) 8196
=================================================================
Total params: 49,362,948
Trainable params: 49,362,948
Non-trainable params: 0
_________________________________________________________________
Epoch 1/20
25/43 [================>.............] - ETA: 19s - loss: 38.6573 - accuracy: 0.2713
43/43 [==============================] - 63s 1s/step - loss: 23.9385 - accuracy: 0.3309 - val_loss: 4.5585 - val_accuracy: 0.2337
Epoch 2/20
43/43 [==============================] - 60s 1s/step - loss: 1.6888 - accuracy: 0.2443 - val_loss: 1.3838 - val_accuracy: 0.2629
Epoch 3/20
43/43 [==============================] - 60s 1s/step - loss: 1.3795 - accuracy: 0.2634 - val_loss: 1.3661 - val_accuracy: 0.2852
Epoch 4/20
43/43 [==============================] - 61s 1s/step - loss: 1.3757 - accuracy: 0.2685 - val_loss: 1.3829 - val_accuracy: 0.2629
Epoch 5/20
43/43 [==============================] - 60s 1s/step - loss: 1.3801 - accuracy: 0.2663 - val_loss: 1.3813 - val_accuracy: 0.2646
Epoch 6/20
43/43 [==============================] - 60s 1s/step - loss: 1.3792 - accuracy: 0.2700 - val_loss: 1.3805 - val_accuracy: 0.2629
Epoch 7/20
43/43 [==============================] - 60s 1s/step - loss: 1.3762 - accuracy: 0.2715 - val_loss: 1.3775 - val_accuracy: 0.2715
Epoch 8/20
43/43 [==============================] - 60s 1s/step - loss: 1.3650 - accuracy: 0.2788 - val_loss: 1.3641 - val_accuracy: 0.2835
Epoch 9/20
43/43 [==============================] - 61s 1s/step - loss: 1.3538 - accuracy: 0.2898 - val_loss: 1.3583 - val_accuracy: 0.2887
Epoch 10/20
43/43 [==============================] - 61s 1s/step - loss: 1.3461 - accuracy: 0.2920 - val_loss: 1.3619 - val_accuracy: 0.2887
Epoch 11/20
43/43 [==============================] - 60s 1s/step - loss: 1.3877 - accuracy: 0.2759 - val_loss: 1.3755 - val_accuracy: 0.2732
Epoch 12/20
43/43 [==============================] - 60s 1s/step - loss: 1.3644 - accuracy: 0.2120 - val_loss: 1.3592 - val_accuracy: 0.2869
Epoch 13/20
43/43 [==============================] - 60s 1s/step - loss: 1.3585 - accuracy: 0.2839 - val_loss: 1.3757 - val_accuracy: 0.2749
Epoch 14/20
43/43 [==============================] - 60s 1s/step - loss: 1.3604 - accuracy: 0.1768 - val_loss: 1.3621 - val_accuracy: 0.3007
Epoch 15/20
43/43 [==============================] - 60s 1s/step - loss: 1.3900 - accuracy: 0.2971 - val_loss: 1.3773 - val_accuracy: 0.2663
Epoch 16/20
43/43 [==============================] - 60s 1s/step - loss: 1.3654 - accuracy: 0.2759 - val_loss: 1.3762 - val_accuracy: 0.2784
Epoch 17/20
43/43 [==============================] - 60s 1s/step - loss: 1.3640 - accuracy: 0.2847 - val_loss: 1.3662 - val_accuracy: 0.2801
With patience set to 8, early stopping halted training at epoch 17: the validation loss had not improved since epoch 9.
print_results(model,un_shuffled_train_data, un_shuffled_validation_data)
plot_results(history_relu_adam_20_epoch_early_stopping)
----->TRAIN
/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1318: UndefinedMetricWarning: Precision and F-score are ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
precision recall f1-score support
0 0.78 0.05 0.09 365
1 0.24 1.00 0.39 324
2 1.00 0.01 0.01 355
3 0.00 0.00 0.00 319
accuracy 0.25 1363
macro avg 0.51 0.26 0.12 1363
weighted avg 0.53 0.25 0.12 1363
---->Validation
precision recall f1-score support
0 0.82 0.06 0.11 156
1 0.24 1.00 0.39 138
2 1.00 0.01 0.01 152
3 0.00 0.00 0.00 136
accuracy 0.25 582
macro avg 0.52 0.27 0.13 582
weighted avg 0.54 0.25 0.12 582
Question: Why is MSE not good for classification problems? Where do we use it?
Mean squared error (MSE) is the most commonly used loss function for regression. Using MSE implicitly assumes the targets are generated from a normal distribution, which is generally not true in classification problems. Moreover, in a classification setting the gradient of MSE becomes very small when the output unit saturates, so the weight updates shrink and the learning process can effectively stall; cross-entropy avoids this because the activation's derivative cancels out of its gradient.
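The saturation argument can be checked numerically. A minimal sketch (plain Python; the single sigmoid unit and the logit value z = -8 are made-up illustrations) compares the gradients of the two losses with respect to the pre-activation z when the prediction is confidently wrong:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Target y = 1, but the unit is saturated at the wrong end (p close to 0).
z, y = -8.0, 1.0
p = sigmoid(z)

# For a sigmoid output, d(MSE)/dz = (p - y) * p * (1 - p): the sigmoid
# derivative p*(1-p) vanishes at saturation, so the update is tiny.
grad_mse = (p - y) * p * (1 - p)

# For cross-entropy, d(CE)/dz = (p - y): the sigmoid derivative cancels,
# leaving a large gradient exactly when the prediction is badly wrong.
grad_ce = p - y
```

Here `grad_ce` is close to -1 while `grad_mse` is on the order of 1e-4, which is consistent with the MSE model trained below sitting at roughly 27% accuracy for all 20 epochs.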
input = keras.layers.Input(shape=(100,100,1))
output = keras.layers.Flatten()(input)
output = keras.layers.Dense(2048, activation='relu')(output)
output = keras.layers.Dense(1024, activation='relu')(output)
output = keras.layers.Dense(4, activation='softmax')(output)
model = keras.models.Model(inputs=input, outputs=output)
model.compile(
    optimizer=keras.optimizers.Adam(learning_rate=0.001),
    loss='mean_squared_error',
    metrics=['accuracy']
)
model.summary()
history_relu_adam_mse = model.fit(train_data, validation_data=validation_data, epochs=20)
Model: "model_43"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_44 (InputLayer) [(None, 100, 100, 1)] 0
flatten_43 (Flatten) (None, 10000) 0
dense_130 (Dense) (None, 2048) 20482048
dense_131 (Dense) (None, 1024) 2098176
dense_132 (Dense) (None, 4) 4100
=================================================================
Total params: 22,584,324
Trainable params: 22,584,324
Non-trainable params: 0
_________________________________________________________________
Epoch 1/20
8/43 [====>.........................] - ETA: 26s - loss: 0.3768 - accuracy: 0.1914
43/43 [==============================] - 57s 1s/step - loss: 0.3628 - accuracy: 0.2641 - val_loss: 0.3660 - val_accuracy: 0.2680
Epoch 2/20
43/43 [==============================] - 55s 1s/step - loss: 0.3661 - accuracy: 0.2678 - val_loss: 0.3660 - val_accuracy: 0.2680
Epochs 3-20 repeat with the same values: loss 0.3661, accuracy 0.2678, val_loss 0.3660, val_accuracy 0.2680 (55-57s each) -- the model never moves off its initial plateau.
print_results(model,un_shuffled_train_data, un_shuffled_validation_data )
plot_results(history_relu_adam_mse)
----->TRAIN
precision recall f1-score support
0 0.27 1.00 0.42 365
1 0.00 0.00 0.00 324
2 0.00 0.00 0.00 355
3 0.00 0.00 0.00 319
accuracy 0.27 1363
macro avg 0.07 0.25 0.11 1363
weighted avg 0.07 0.27 0.11 1363
---->Validation
precision recall f1-score support
0 0.27 1.00 0.42 156
1 0.00 0.00 0.00 138
2 0.00 0.00 0.00 152
3 0.00 0.00 0.00 136
accuracy 0.27 582
macro avg 0.07 0.25 0.11 582
weighted avg 0.07 0.27 0.11 582
input = keras.layers.Input(shape=(100,100,1))
output = keras.layers.Flatten()(input)
output = keras.layers.Dense(2048, activation='relu', kernel_regularizer=keras.regularizers.l2(l2=0.0001))(output)
output = keras.layers.Dense(1024, activation='relu', kernel_regularizer=keras.regularizers.l2(l2=0.0001))(output)
output = keras.layers.Dense(4, activation='softmax', kernel_regularizer=keras.regularizers.l2(l2=0.0001))(output)
model = keras.models.Model(inputs=input, outputs=output)
model.compile(
    optimizer=keras.optimizers.Adam(learning_rate=0.001),
    loss='categorical_crossentropy',
    metrics=['accuracy']
)
model.summary()
history_relu_adam_L2 = model.fit(train_data, validation_data=validation_data , epochs=20)
Model: "model_44"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_45 (InputLayer) [(None, 100, 100, 1)] 0
flatten_44 (Flatten) (None, 10000) 0
dense_133 (Dense) (None, 2048) 20482048
dense_134 (Dense) (None, 1024) 2098176
dense_135 (Dense) (None, 4) 4100
=================================================================
Total params: 22,584,324
Trainable params: 22,584,324
Non-trainable params: 0
_________________________________________________________________
Epoch 1/20
14/43 [========>.....................] - ETA: 27s - loss: 12.3197 - accuracy: 0.2589
/usr/local/lib/python3.7/dist-packages/PIL/Image.py:960: UserWarning: Palette images with Transparency expressed in bytes should be converted to RGBA images "Palette images with Transparency expressed in bytes should be "
43/43 [==============================] - 61s 1s/step - loss: 5.5275 - accuracy: 0.2927 - val_loss: 1.7086 - val_accuracy: 0.4210 Epoch 2/20 43/43 [==============================] - 59s 1s/step - loss: 1.6468 - accuracy: 0.4028 - val_loss: 1.5723 - val_accuracy: 0.4124 Epoch 3/20 43/43 [==============================] - 59s 1s/step - loss: 1.5363 - accuracy: 0.4674 - val_loss: 1.6382 - val_accuracy: 0.4158 Epoch 4/20 43/43 [==============================] - 59s 1s/step - loss: 1.5016 - accuracy: 0.4652 - val_loss: 1.5228 - val_accuracy: 0.4536 Epoch 5/20 43/43 [==============================] - 59s 1s/step - loss: 1.4985 - accuracy: 0.4468 - val_loss: 1.4220 - val_accuracy: 0.4450 Epoch 6/20 43/43 [==============================] - 59s 1s/step - loss: 1.4068 - accuracy: 0.5150 - val_loss: 1.5523 - val_accuracy: 0.4588 Epoch 7/20 43/43 [==============================] - 61s 1s/step - loss: 1.3411 - accuracy: 0.5459 - val_loss: 1.3092 - val_accuracy: 0.5241 Epoch 8/20 43/43 [==============================] - 60s 1s/step - loss: 1.2501 - accuracy: 0.5686 - val_loss: 1.3316 - val_accuracy: 0.5258 Epoch 9/20 43/43 [==============================] - 59s 1s/step - loss: 1.2357 - accuracy: 0.5759 - val_loss: 1.3974 - val_accuracy: 0.4777 Epoch 10/20 43/43 [==============================] - 60s 1s/step - loss: 1.1835 - accuracy: 0.5994 - val_loss: 1.2442 - val_accuracy: 0.5636 Epoch 11/20 43/43 [==============================] - 60s 1s/step - loss: 1.1390 - accuracy: 0.6075 - val_loss: 1.2517 - val_accuracy: 0.5447 Epoch 12/20 43/43 [==============================] - 60s 1s/step - loss: 1.0294 - accuracy: 0.6779 - val_loss: 1.1953 - val_accuracy: 0.5928 Epoch 13/20 43/43 [==============================] - 59s 1s/step - loss: 1.0170 - accuracy: 0.6742 - val_loss: 1.1815 - val_accuracy: 0.5997 Epoch 14/20 43/43 [==============================] - 60s 1s/step - loss: 1.0017 - accuracy: 0.6742 - val_loss: 1.2325 - val_accuracy: 0.5739 Epoch 15/20 43/43 
[==============================] - 60s 1s/step - loss: 0.9363 - accuracy: 0.7124 - val_loss: 1.1591 - val_accuracy: 0.6237 Epoch 16/20 43/43 [==============================] - 60s 1s/step - loss: 0.8611 - accuracy: 0.7469 - val_loss: 1.0984 - val_accuracy: 0.6375 Epoch 17/20 43/43 [==============================] - 60s 1s/step - loss: 0.7846 - accuracy: 0.7711 - val_loss: 1.0426 - val_accuracy: 0.6976 Epoch 18/20 43/43 [==============================] - 59s 1s/step - loss: 0.6841 - accuracy: 0.8048 - val_loss: 1.0948 - val_accuracy: 0.6581 Epoch 19/20 43/43 [==============================] - 59s 1s/step - loss: 0.7681 - accuracy: 0.7858 - val_loss: 1.1312 - val_accuracy: 0.6392 Epoch 20/20 43/43 [==============================] - 59s 1s/step - loss: 0.6987 - accuracy: 0.8078 - val_loss: 1.2599 - val_accuracy: 0.6220
print_results(model,un_shuffled_train_data, un_shuffled_validation_data )
plot_results(history_relu_adam_L2)
----->TRAIN
precision recall f1-score support
0 0.62 0.95 0.75 365
1 0.88 0.92 0.90 324
2 0.86 0.95 0.90 355
3 1.00 0.24 0.38 319
accuracy 0.78 1363
macro avg 0.84 0.76 0.73 1363
weighted avg 0.83 0.78 0.74 1363
---->Validation
precision recall f1-score support
0 0.52 0.76 0.62 156
1 0.71 0.71 0.71 138
2 0.64 0.80 0.71 152
3 0.92 0.18 0.30 136
accuracy 0.62 582
macro avg 0.70 0.61 0.58 582
weighted avg 0.69 0.62 0.59 582
We add dropout to prevent overfitting. Dropout refers to ignoring a randomly chosen set of neurons during the training phase: the ignored units are not considered during a particular forward or backward pass. At each training step, individual nodes are either dropped out of the net with probability 1-p or kept with probability p, so that a reduced network is left.
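The mechanism can be sketched in plain NumPy. This is the "inverted" dropout that Keras's `Dropout` layer applies at training time; the `dropout_forward` helper and `keep_prob` name are ours, not part of the notebook's code:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(x, keep_prob):
    # Zero each unit with probability 1 - keep_prob, then rescale the
    # survivors by 1/keep_prob so the expected activation is unchanged.
    mask = rng.random(x.shape) < keep_prob
    return x * mask / keep_prob

x = np.ones(10000)
y = dropout_forward(x, keep_prob=0.9)  # ~10% of units zeroed, mean stays near 1.0
```

At inference time no units are dropped and no rescaling is needed, which is why the rescaling is folded into the training pass.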
input = keras.layers.Input(shape=(100,100,1))
output = keras.layers.Flatten()(input)
output = keras.layers.Dense(2048, activation='relu')(output)
output = layers.Dropout(0.1)(output)
output = keras.layers.Dense(1024, activation='relu')(output)
output = layers.Dropout(0.1)(output)
output = keras.layers.Dense(4, activation='softmax')(output)
model = keras.models.Model(inputs=input, outputs=output)
model.compile(
optimizer=keras.optimizers.Adam(learning_rate=0.001),
loss='categorical_crossentropy',
metrics=['accuracy']
)
model.summary()
history_relu_adam_dropout = model.fit(train_data, validation_data=validation_data , epochs=20)
Model: "model_46"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_47 (InputLayer) [(None, 100, 100, 1)] 0
flatten_46 (Flatten) (None, 10000) 0
dense_139 (Dense) (None, 2048) 20482048
dropout_2 (Dropout) (None, 2048) 0
dense_140 (Dense) (None, 1024) 2098176
dropout_3 (Dropout) (None, 1024) 0
dense_141 (Dense) (None, 4) 4100
=================================================================
Total params: 22,584,324
Trainable params: 22,584,324
Non-trainable params: 0
_________________________________________________________________
Epoch 1/20
11/43 [======>.......................] - ETA: 33s - loss: 16.7714 - accuracy: 0.2528
43/43 [==============================] - 59s 1s/step - loss: 5.8466 - accuracy: 0.2935 - val_loss: 1.2729 - val_accuracy: 0.4141 Epoch 2/20 43/43 [==============================] - 57s 1s/step - loss: 1.3269 - accuracy: 0.4087 - val_loss: 1.2961 - val_accuracy: 0.3969 Epoch 3/20 43/43 [==============================] - 57s 1s/step - loss: 1.2296 - accuracy: 0.4351 - val_loss: 1.1849 - val_accuracy: 0.4622 Epoch 4/20 43/43 [==============================] - 57s 1s/step - loss: 1.1879 - accuracy: 0.4475 - val_loss: 1.1383 - val_accuracy: 0.4914 Epoch 5/20 43/43 [==============================] - 56s 1s/step - loss: 1.1546 - accuracy: 0.4762 - val_loss: 1.1032 - val_accuracy: 0.5189 Epoch 6/20 43/43 [==============================] - 56s 1s/step - loss: 1.1473 - accuracy: 0.4806 - val_loss: 1.1456 - val_accuracy: 0.4742 Epoch 7/20 43/43 [==============================] - 56s 1s/step - loss: 1.1283 - accuracy: 0.4923 - val_loss: 1.1081 - val_accuracy: 0.5000 Epoch 8/20 43/43 [==============================] - 56s 1s/step - loss: 1.1004 - accuracy: 0.5048 - val_loss: 1.1549 - val_accuracy: 0.4485 Epoch 9/20 43/43 [==============================] - 56s 1s/step - loss: 1.0920 - accuracy: 0.5268 - val_loss: 1.1561 - val_accuracy: 0.4725 Epoch 10/20 43/43 [==============================] - 56s 1s/step - loss: 1.0978 - accuracy: 0.5202 - val_loss: 1.1514 - val_accuracy: 0.4897 Epoch 11/20 43/43 [==============================] - 56s 1s/step - loss: 1.0242 - accuracy: 0.5576 - val_loss: 1.1214 - val_accuracy: 0.5172 Epoch 12/20 43/43 [==============================] - 56s 1s/step - loss: 0.9478 - accuracy: 0.5979 - val_loss: 1.0607 - val_accuracy: 0.5481 Epoch 13/20 43/43 [==============================] - 56s 1s/step - loss: 0.9532 - accuracy: 0.5701 - val_loss: 1.1497 - val_accuracy: 0.4828 Epoch 14/20 43/43 [==============================] - 56s 1s/step - loss: 0.9516 - accuracy: 0.6031 - val_loss: 1.0521 - val_accuracy: 0.5722 Epoch 15/20 43/43 
[==============================] - 56s 1s/step - loss: 0.9527 - accuracy: 0.5730 - val_loss: 1.0687 - val_accuracy: 0.5344 Epoch 16/20 43/43 [==============================] - 56s 1s/step - loss: 0.9426 - accuracy: 0.5965 - val_loss: 0.9876 - val_accuracy: 0.6031 Epoch 17/20 43/43 [==============================] - 56s 1s/step - loss: 0.8694 - accuracy: 0.6178 - val_loss: 1.1571 - val_accuracy: 0.4966 Epoch 18/20 43/43 [==============================] - 56s 1s/step - loss: 0.9932 - accuracy: 0.5422 - val_loss: 1.0986 - val_accuracy: 0.5103 Epoch 19/20 43/43 [==============================] - 56s 1s/step - loss: 0.9236 - accuracy: 0.5767 - val_loss: 1.3443 - val_accuracy: 0.4794 Epoch 20/20 43/43 [==============================] - 56s 1s/step - loss: 0.8604 - accuracy: 0.6376 - val_loss: 1.0260 - val_accuracy: 0.5790
print_results(model,un_shuffled_train_data, un_shuffled_validation_data )
plot_results(history_relu_adam_dropout)
----->TRAIN
precision recall f1-score support
0 0.57 0.67 0.61 365
1 0.76 0.76 0.76 324
2 0.91 0.72 0.81 355
3 0.71 0.72 0.72 319
accuracy 0.72 1363
macro avg 0.74 0.72 0.72 1363
weighted avg 0.74 0.72 0.72 1363
---->Validation
precision recall f1-score support
0 0.46 0.60 0.52 156
1 0.55 0.62 0.59 138
2 0.82 0.53 0.64 152
3 0.62 0.57 0.60 136
accuracy 0.58 582
macro avg 0.61 0.58 0.59 582
weighted avg 0.61 0.58 0.58 582
As overfitting does not occur before epoch 20, the regularization methods do not have a drastic effect on this problem, although the L2 method gave us better results than the dropout method.
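For reference, the extra term that `kernel_regularizer=keras.regularizers.l2(l2=0.0001)` adds to the loss is simply λ·Σw² summed over every kernel it is attached to. A minimal sketch (the `l2_penalty` helper and the toy weight matrices are ours):

```python
import numpy as np

def l2_penalty(weights, lam=1e-4):
    # lam * sum of squared entries over every weight matrix,
    # mirroring keras.regularizers.l2(l2=0.0001)
    return lam * sum(float(np.sum(w ** 2)) for w in weights)

w = [np.ones((2, 2)), 2.0 * np.ones(3)]
penalty = l2_penalty(w)  # 1e-4 * (4*1 + 3*4) = 0.0016
```

Because the penalty grows with the squared weights, it nudges the optimizer toward smaller kernels, which is the effect being compared against dropout above.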
input = keras.layers.Input(shape=(100,100,1))
output = keras.layers.Flatten()(input)
output = keras.layers.Dense(2048, activation='relu', kernel_regularizer=keras.regularizers.l2(l2=0.0001))(output)
output = layers.Dropout(0.1)(output)
output = keras.layers.Dense(1024, activation='relu', kernel_regularizer=keras.regularizers.l2(l2=0.0001))(output)
output = layers.Dropout(0.1)(output)
output = keras.layers.Dense(4, activation='softmax', kernel_regularizer=keras.regularizers.l2(l2=0.0001))(output)
model = keras.models.Model(inputs=input, outputs=output)
model.compile(
optimizer=keras.optimizers.Adam(learning_rate=0.001),
loss='categorical_crossentropy',
metrics=['accuracy']
)
model.summary()
history_relu_adam_dropout_l2 = model.fit(train_data, validation_data=validation_data , epochs=20)
Model: "model_48"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_49 (InputLayer) [(None, 100, 100, 1)] 0
flatten_48 (Flatten) (None, 10000) 0
dense_145 (Dense) (None, 2048) 20482048
dropout_6 (Dropout) (None, 2048) 0
dense_146 (Dense) (None, 1024) 2098176
dropout_7 (Dropout) (None, 1024) 0
dense_147 (Dense) (None, 4) 4100
=================================================================
Total params: 22,584,324
Trainable params: 22,584,324
Non-trainable params: 0
_________________________________________________________________
Epoch 1/20
3/43 [=>............................] - ETA: 34s - loss: 32.6859 - accuracy: 0.1562
43/43 [==============================] - 63s 1s/step - loss: 7.2531 - accuracy: 0.2568 - val_loss: 1.7316 - val_accuracy: 0.3832 Epoch 2/20 43/43 [==============================] - 61s 1s/step - loss: 1.7489 - accuracy: 0.3573 - val_loss: 1.6491 - val_accuracy: 0.4072 Epoch 3/20 43/43 [==============================] - 61s 1s/step - loss: 1.6134 - accuracy: 0.4175 - val_loss: 1.5204 - val_accuracy: 0.4674 Epoch 4/20 43/43 [==============================] - 61s 1s/step - loss: 1.5443 - accuracy: 0.4666 - val_loss: 1.5458 - val_accuracy: 0.4639 Epoch 5/20 43/43 [==============================] - 62s 1s/step - loss: 1.5046 - accuracy: 0.4894 - val_loss: 1.4533 - val_accuracy: 0.4674 Epoch 6/20 43/43 [==============================] - 61s 1s/step - loss: 1.4707 - accuracy: 0.4769 - val_loss: 1.4281 - val_accuracy: 0.4845 Epoch 7/20 43/43 [==============================] - 61s 1s/step - loss: 1.4307 - accuracy: 0.5077 - val_loss: 1.4151 - val_accuracy: 0.5155 Epoch 8/20 43/43 [==============================] - 62s 1s/step - loss: 1.3656 - accuracy: 0.5547 - val_loss: 1.3902 - val_accuracy: 0.5137 Epoch 9/20 43/43 [==============================] - 62s 1s/step - loss: 1.3143 - accuracy: 0.5605 - val_loss: 1.3142 - val_accuracy: 0.5378 Epoch 10/20 43/43 [==============================] - 60s 1s/step - loss: 1.3406 - accuracy: 0.5503 - val_loss: 1.3411 - val_accuracy: 0.5361 Epoch 11/20 43/43 [==============================] - 60s 1s/step - loss: 1.2736 - accuracy: 0.5679 - val_loss: 1.3236 - val_accuracy: 0.5361 Epoch 12/20 43/43 [==============================] - 60s 1s/step - loss: 1.2645 - accuracy: 0.5393 - val_loss: 1.2892 - val_accuracy: 0.5515 Epoch 13/20 43/43 [==============================] - 59s 1s/step - loss: 1.2302 - accuracy: 0.5803 - val_loss: 1.2946 - val_accuracy: 0.5533 Epoch 14/20 43/43 [==============================] - 60s 1s/step - loss: 1.1819 - accuracy: 0.5979 - val_loss: 1.3745 - val_accuracy: 0.5069 Epoch 15/20 43/43 
[==============================] - 60s 1s/step - loss: 1.1491 - accuracy: 0.6016 - val_loss: 1.3391 - val_accuracy: 0.5172 Epoch 16/20 43/43 [==============================] - 60s 1s/step - loss: 1.0824 - accuracy: 0.6383 - val_loss: 1.3231 - val_accuracy: 0.5825 Epoch 17/20 43/43 [==============================] - 60s 1s/step - loss: 1.0337 - accuracy: 0.6376 - val_loss: 1.2117 - val_accuracy: 0.6117 Epoch 18/20 43/43 [==============================] - 60s 1s/step - loss: 1.0430 - accuracy: 0.6339 - val_loss: 1.2696 - val_accuracy: 0.5034 Epoch 19/20 43/43 [==============================] - 60s 1s/step - loss: 1.0904 - accuracy: 0.6090 - val_loss: 1.2620 - val_accuracy: 0.5687 Epoch 20/20 43/43 [==============================] - 61s 1s/step - loss: 1.0615 - accuracy: 0.6258 - val_loss: 1.3106 - val_accuracy: 0.5326
print_results(model,un_shuffled_train_data, un_shuffled_validation_data )
plot_results(history_relu_adam_dropout_l2)
----->TRAIN
precision recall f1-score support
0 0.60 0.54 0.57 365
1 0.56 0.90 0.69 324
2 0.94 0.57 0.71 355
3 0.69 0.66 0.67 319
accuracy 0.66 1363
macro avg 0.70 0.66 0.66 1363
weighted avg 0.70 0.66 0.66 1363
---->Validation
precision recall f1-score support
0 0.47 0.48 0.48 156
1 0.48 0.78 0.59 138
2 0.72 0.38 0.50 152
3 0.60 0.51 0.55 136
accuracy 0.53 582
macro avg 0.57 0.54 0.53 582
weighted avg 0.57 0.53 0.53 582
input = keras.layers.Input(shape=(100,100,1))
output = keras.layers.Flatten()(input)
output = keras.layers.Dense(4096, activation='relu', kernel_regularizer=keras.regularizers.l2(l2=0.0001))(output)
output = layers.Dropout(0.1)(output)
output = keras.layers.Dense(2048, activation='relu', kernel_regularizer=keras.regularizers.l2(l2=0.0001))(output)
output = layers.Dropout(0.1)(output)
output = keras.layers.Dense(4, activation='softmax', kernel_regularizer=keras.regularizers.l2(l2=0.0001))(output)
model = keras.models.Model(inputs=input, outputs=output)
model.compile(
optimizer=keras.optimizers.Adam(learning_rate=0.001),
loss='categorical_crossentropy',
metrics=['accuracy']
)
model.summary()
history_relu_adam_dropout_l2_4096 = model.fit(train_data, validation_data=validation_data , epochs=20)
Model: "model_49"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_50 (InputLayer) [(None, 100, 100, 1)] 0
flatten_49 (Flatten) (None, 10000) 0
dense_148 (Dense) (None, 4096) 40964096
dropout_8 (Dropout) (None, 4096) 0
dense_149 (Dense) (None, 2048) 8390656
dropout_9 (Dropout) (None, 2048) 0
dense_150 (Dense) (None, 4) 8196
=================================================================
Total params: 49,362,948
Trainable params: 49,362,948
Non-trainable params: 0
_________________________________________________________________
Epoch 1/20
13/43 [========>.....................] - ETA: 33s - loss: 24.0378 - accuracy: 0.2380
43/43 [==============================] - 72s 2s/step - loss: 8.9614 - accuracy: 0.2986 - val_loss: 2.0536 - val_accuracy: 0.3625 Epoch 2/20 43/43 [==============================] - 71s 2s/step - loss: 2.0044 - accuracy: 0.3969 - val_loss: 1.8581 - val_accuracy: 0.4708 Epoch 3/20 43/43 [==============================] - 71s 2s/step - loss: 1.8723 - accuracy: 0.4307 - val_loss: 1.8447 - val_accuracy: 0.4124 Epoch 4/20 43/43 [==============================] - 70s 2s/step - loss: 1.7853 - accuracy: 0.4710 - val_loss: 1.7692 - val_accuracy: 0.4708 Epoch 5/20 43/43 [==============================] - 70s 2s/step - loss: 1.7159 - accuracy: 0.4615 - val_loss: 1.7614 - val_accuracy: 0.4467 Epoch 6/20 43/43 [==============================] - 70s 2s/step - loss: 1.6243 - accuracy: 0.5004 - val_loss: 1.5680 - val_accuracy: 0.5223 Epoch 7/20 43/43 [==============================] - 70s 2s/step - loss: 1.6105 - accuracy: 0.5011 - val_loss: 1.6850 - val_accuracy: 0.4674 Epoch 8/20 43/43 [==============================] - 70s 2s/step - loss: 1.5179 - accuracy: 0.5290 - val_loss: 1.6800 - val_accuracy: 0.4983 Epoch 9/20 43/43 [==============================] - 71s 2s/step - loss: 1.5106 - accuracy: 0.5216 - val_loss: 1.5597 - val_accuracy: 0.4863 Epoch 10/20 43/43 [==============================] - 70s 2s/step - loss: 1.4428 - accuracy: 0.5400 - val_loss: 1.4737 - val_accuracy: 0.5172 Epoch 11/20 43/43 [==============================] - 71s 2s/step - loss: 1.4426 - accuracy: 0.5422 - val_loss: 1.5846 - val_accuracy: 0.4467 Epoch 12/20 43/43 [==============================] - 73s 2s/step - loss: 1.4508 - accuracy: 0.5275 - val_loss: 1.4674 - val_accuracy: 0.5189 Epoch 13/20 43/43 [==============================] - 73s 2s/step - loss: 1.4183 - accuracy: 0.5253 - val_loss: 1.4783 - val_accuracy: 0.4931 Epoch 14/20 43/43 [==============================] - 71s 2s/step - loss: 1.3607 - accuracy: 0.5415 - val_loss: 1.4036 - val_accuracy: 0.5481 Epoch 15/20 43/43 
[==============================] - 70s 2s/step - loss: 1.3588 - accuracy: 0.5194 - val_loss: 1.5605 - val_accuracy: 0.3986 Epoch 16/20 43/43 [==============================] - 70s 2s/step - loss: 1.3881 - accuracy: 0.5121 - val_loss: 1.4277 - val_accuracy: 0.5361 Epoch 17/20 43/43 [==============================] - 70s 2s/step - loss: 1.2679 - accuracy: 0.5737 - val_loss: 1.3679 - val_accuracy: 0.5292 Epoch 18/20 43/43 [==============================] - 70s 2s/step - loss: 1.2895 - accuracy: 0.5605 - val_loss: 1.4227 - val_accuracy: 0.4863 Epoch 19/20 43/43 [==============================] - 70s 2s/step - loss: 1.3432 - accuracy: 0.4974 - val_loss: 1.5484 - val_accuracy: 0.3729 Epoch 20/20 43/43 [==============================] - 70s 2s/step - loss: 1.3344 - accuracy: 0.4989 - val_loss: 1.4440 - val_accuracy: 0.5052
data_generator = preprocessing.image.ImageDataGenerator(
rescale=1/255.0)
un_shuffled_all_train_data = data_generator.flow_from_directory(
"/content/dataset/train",
target_size=(100, 100),
color_mode="grayscale",
class_mode='categorical',
batch_size=32,
shuffle = False
)
all_train_data = data_generator.flow_from_directory(
"/content/dataset/train",
target_size=(100, 100),
color_mode="grayscale",
class_mode='categorical',
batch_size=32,
)
un_shuffled_test_data = data_generator.flow_from_directory(
"/content/dataset/test",
target_size=(100, 100),
color_mode="grayscale",
class_mode='categorical',
batch_size=32,
shuffle = False
)
test_data = data_generator.flow_from_directory(
"/content/dataset/test",
target_size=(100, 100),
color_mode="grayscale",
class_mode='categorical',
batch_size=32,
)
Found 1945 images belonging to 4 classes. Found 1945 images belonging to 4 classes. Found 833 images belonging to 4 classes. Found 833 images belonging to 4 classes.
input = keras.layers.Input(shape=(100,100,1))
output = keras.layers.Flatten()(input)
output = keras.layers.Dense(4096, activation='relu')(output)
output = keras.layers.Dense(2048, activation='relu')(output)
output = keras.layers.Dense(4, activation='softmax')(output)
model = keras.models.Model(inputs=input, outputs=output)
model.compile(
optimizer=keras.optimizers.Adam(learning_rate=0.001),
loss='categorical_crossentropy',
metrics=['accuracy']
)
model.summary()
history_relu_adam_dropout_l2_test = model.fit(train_data, validation_data=test_data , epochs=20)
Model: "model_3"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_4 (InputLayer) [(None, 100, 100, 1)] 0
flatten_3 (Flatten) (None, 10000) 0
dense_9 (Dense) (None, 4096) 40964096
dense_10 (Dense) (None, 2048) 8390656
dense_11 (Dense) (None, 4) 8196
=================================================================
Total params: 49,362,948
Trainable params: 49,362,948
Non-trainable params: 0
_________________________________________________________________
Epoch 1/20
4/43 [=>............................] - ETA: 39s - loss: 36.9174 - accuracy: 0.3913
43/43 [==============================] - 91s 2s/step - loss: 7.1182 - accuracy: 0.3382 - val_loss: 1.2292 - val_accuracy: 0.3866 Epoch 2/20 43/43 [==============================] - 72s 2s/step - loss: 1.2113 - accuracy: 0.4417 - val_loss: 1.4607 - val_accuracy: 0.3385 Epoch 3/20 43/43 [==============================] - 89s 2s/step - loss: 1.1809 - accuracy: 0.4409 - val_loss: 1.2684 - val_accuracy: 0.3962 Epoch 4/20 43/43 [==============================] - 72s 2s/step - loss: 1.1640 - accuracy: 0.4615 - val_loss: 1.1439 - val_accuracy: 0.4634 Epoch 5/20 43/43 [==============================] - 72s 2s/step - loss: 1.0652 - accuracy: 0.5033 - val_loss: 1.1689 - val_accuracy: 0.4898 Epoch 6/20 43/43 [==============================] - 89s 2s/step - loss: 1.0402 - accuracy: 0.5363 - val_loss: 1.3159 - val_accuracy: 0.4178 Epoch 7/20 43/43 [==============================] - 72s 2s/step - loss: 1.0390 - accuracy: 0.5312 - val_loss: 1.1178 - val_accuracy: 0.5030 Epoch 8/20 43/43 [==============================] - 72s 2s/step - loss: 1.0008 - accuracy: 0.5627 - val_loss: 1.1731 - val_accuracy: 0.4778 Epoch 9/20 43/43 [==============================] - 72s 2s/step - loss: 1.0311 - accuracy: 0.5686 - val_loss: 1.1772 - val_accuracy: 0.4982 Epoch 10/20 43/43 [==============================] - 72s 2s/step - loss: 0.9216 - accuracy: 0.6097 - val_loss: 1.1827 - val_accuracy: 0.5234 Epoch 11/20 43/43 [==============================] - 72s 2s/step - loss: 0.9198 - accuracy: 0.6075 - val_loss: 1.2069 - val_accuracy: 0.4946 Epoch 12/20 43/43 [==============================] - 72s 2s/step - loss: 0.9137 - accuracy: 0.6060 - val_loss: 1.1931 - val_accuracy: 0.5090 Epoch 13/20 43/43 [==============================] - 72s 2s/step - loss: 0.8412 - accuracy: 0.6552 - val_loss: 1.0649 - val_accuracy: 0.6002 Epoch 14/20 43/43 [==============================] - 72s 2s/step - loss: 0.7539 - accuracy: 0.6845 - val_loss: 1.1287 - val_accuracy: 0.5378 Epoch 15/20 43/43 
[==============================] - 71s 2s/step - loss: 0.7679 - accuracy: 0.6838 - val_loss: 1.0297 - val_accuracy: 0.5918 Epoch 16/20 43/43 [==============================] - 72s 2s/step - loss: 0.7082 - accuracy: 0.7205 - val_loss: 1.1068 - val_accuracy: 0.5822 Epoch 17/20 43/43 [==============================] - 72s 2s/step - loss: 0.6445 - accuracy: 0.7381 - val_loss: 1.0910 - val_accuracy: 0.6158 Epoch 18/20 43/43 [==============================] - 72s 2s/step - loss: 0.7353 - accuracy: 0.7087 - val_loss: 1.1732 - val_accuracy: 0.5450 Epoch 19/20 43/43 [==============================] - 89s 2s/step - loss: 0.6925 - accuracy: 0.7263 - val_loss: 1.0712 - val_accuracy: 0.6014 Epoch 20/20 43/43 [==============================] - 89s 2s/step - loss: 0.5481 - accuracy: 0.8019 - val_loss: 1.1721 - val_accuracy: 0.6230
print_results(model,un_shuffled_train_data, un_shuffled_validation_data)
plot_results(history_relu_adam_dropout_l2_test)
----->TRAIN
precision recall f1-score support
0 0.85 0.68 0.76 365
1 0.73 0.95 0.82 324
2 0.87 0.85 0.86 355
3 0.91 0.85 0.88 319
accuracy 0.83 1363
macro avg 0.84 0.83 0.83 1363
weighted avg 0.84 0.83 0.83 1363
---->Test
precision recall f1-score support
0 0.64 0.47 0.55 156
1 0.54 0.80 0.65 138
2 0.70 0.61 0.65 152
3 0.74 0.71 0.73 136
accuracy 0.64 582
macro avg 0.66 0.65 0.64 582
weighted avg 0.66 0.64 0.64 582
As mentioned above, and confirmed again here, the Adam optimizer did not give good results, so I decided to use SGD as the optimizer instead.
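The update rule applied by `keras.optimizers.SGD(learning_rate=0.01, momentum=0.5)` (with the default `nesterov=False`) can be sketched in a few lines; the `sgd_momentum_step` helper is ours, for illustration only:

```python
def sgd_momentum_step(w, grad, velocity, lr=0.01, momentum=0.5):
    # v <- momentum * v - lr * grad
    # w <- w + v
    velocity = momentum * velocity - lr * grad
    return w + velocity, velocity

w, v = 1.0, 0.0
w, v = sgd_momentum_step(w, 2.0, v)  # v = -0.02, w = 0.98
w, v = sgd_momentum_step(w, 2.0, v)  # v = -0.03, w = 0.95
```

The velocity term accumulates past gradients, which smooths the updates compared with plain SGD.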
test_data.class_indices
{'bald_eagle': 0, 'elk': 1, 'racoon': 2, 'raven': 3}
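Since `flow_from_directory` assigns indices from the subdirectory names, the integer predictions produced by `np.argmax` can be mapped back to class names by inverting this dictionary; a minimal sketch (`index_to_class` is our own helper, not part of the notebook's code):

```python
# Invert the class_indices mapping printed above
class_indices = {'bald_eagle': 0, 'elk': 1, 'racoon': 2, 'raven': 3}
index_to_class = {v: k for k, v in class_indices.items()}
print(index_to_class[2])  # racoon
```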
Best Model
input = keras.layers.Input(shape=(100,100,1))
output = keras.layers.Flatten()(input)
output = keras.layers.Dense(4096, activation='relu')(output)
output = keras.layers.Dense(2048, activation='relu')(output)
output = keras.layers.Dense(4, activation='softmax')(output)
best_model = keras.models.Model(inputs=input, outputs=output)
best_model.compile(
optimizer=keras.optimizers.SGD(learning_rate=0.01, momentum=0.5),
loss='categorical_crossentropy',
metrics=['accuracy']
)
best_model.summary()
history_relu_bets = best_model.fit(train_data, validation_data=validation_data , epochs=20)
Model: "model_18"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_17 (InputLayer) [(None, 100, 100, 1)] 0
flatten_15 (Flatten) (None, 10000) 0
dense_47 (Dense) (None, 4096) 40964096
dense_48 (Dense) (None, 2048) 8390656
dense_49 (Dense) (None, 4) 8196
=================================================================
Total params: 49,362,948
Trainable params: 49,362,948
Non-trainable params: 0
_________________________________________________________________
Epoch 1/20
10/43 [=====>........................] - ETA: 31s - loss: 2.4855 - accuracy: 0.2844
43/43 [==============================] - 59s 1s/step - loss: 1.5774 - accuracy: 0.3720 - val_loss: 1.2137 - val_accuracy: 0.4880 Epoch 2/20 43/43 [==============================] - 58s 1s/step - loss: 1.1714 - accuracy: 0.4982 - val_loss: 1.2173 - val_accuracy: 0.4656 Epoch 3/20 43/43 [==============================] - 58s 1s/step - loss: 1.0688 - accuracy: 0.5444 - val_loss: 1.0626 - val_accuracy: 0.5361 Epoch 4/20 43/43 [==============================] - 63s 1s/step - loss: 1.0065 - accuracy: 0.6001 - val_loss: 0.9960 - val_accuracy: 0.6082 Epoch 5/20 43/43 [==============================] - 59s 1s/step - loss: 0.9603 - accuracy: 0.6222 - val_loss: 1.0551 - val_accuracy: 0.5704 Epoch 6/20 43/43 [==============================] - 59s 1s/step - loss: 0.9236 - accuracy: 0.6354 - val_loss: 0.9296 - val_accuracy: 0.6151 Epoch 7/20 43/43 [==============================] - 58s 1s/step - loss: 0.8314 - accuracy: 0.6853 - val_loss: 0.8779 - val_accuracy: 0.6564 Epoch 8/20 43/43 [==============================] - 58s 1s/step - loss: 0.8063 - accuracy: 0.6867 - val_loss: 0.9317 - val_accuracy: 0.5893 Epoch 9/20 43/43 [==============================] - 58s 1s/step - loss: 0.7770 - accuracy: 0.7087 - val_loss: 0.8956 - val_accuracy: 0.6151 Epoch 10/20 43/43 [==============================] - 58s 1s/step - loss: 0.7208 - accuracy: 0.7212 - val_loss: 0.8005 - val_accuracy: 0.7182 Epoch 11/20 43/43 [==============================] - 59s 1s/step - loss: 0.6662 - accuracy: 0.7616 - val_loss: 1.1362 - val_accuracy: 0.5344 Epoch 12/20 43/43 [==============================] - 58s 1s/step - loss: 0.6388 - accuracy: 0.7711 - val_loss: 0.8350 - val_accuracy: 0.6632 Epoch 13/20 43/43 [==============================] - 58s 1s/step - loss: 0.6061 - accuracy: 0.7902 - val_loss: 0.7881 - val_accuracy: 0.6890 Epoch 14/20 43/43 [==============================] - 58s 1s/step - loss: 0.5676 - accuracy: 0.8041 - val_loss: 0.7626 - val_accuracy: 0.7027 Epoch 15/20 43/43 
[==============================] - 58s 1s/step - loss: 0.4842 - accuracy: 0.8511 - val_loss: 0.7782 - val_accuracy: 0.7251 Epoch 16/20 43/43 [==============================] - 58s 1s/step - loss: 0.5167 - accuracy: 0.8408 - val_loss: 1.0296 - val_accuracy: 0.5550 Epoch 17/20 43/43 [==============================] - 58s 1s/step - loss: 0.5425 - accuracy: 0.8225 - val_loss: 0.7388 - val_accuracy: 0.7268 Epoch 18/20 43/43 [==============================] - 58s 1s/step - loss: 0.3992 - accuracy: 0.8797 - val_loss: 0.6818 - val_accuracy: 0.7509 Epoch 19/20 43/43 [==============================] - 58s 1s/step - loss: 0.3909 - accuracy: 0.8701 - val_loss: 1.0573 - val_accuracy: 0.6460 Epoch 20/20 43/43 [==============================] - 59s 1s/step - loss: 0.4772 - accuracy: 0.8283 - val_loss: 0.7440 - val_accuracy: 0.7251
plot_results(history_relu_bets)
print_results(best_model,un_shuffled_train_data, un_shuffled_test_data)
----->TRAIN
precision recall f1-score support
0 0.97 0.84 0.90 365
1 0.88 1.00 0.94 324
2 1.00 0.89 0.94 355
3 0.83 0.95 0.89 319
accuracy 0.92 1363
macro avg 0.92 0.92 0.92 1363
weighted avg 0.93 0.92 0.92 1363
---->Test
precision recall f1-score support
0 0.81 0.58 0.67 223
1 0.69 0.90 0.78 198
2 0.84 0.63 0.72 217
3 0.64 0.83 0.72 195
accuracy 0.73 833
macro avg 0.74 0.73 0.73 833
weighted avg 0.75 0.73 0.72 833
def print_results(modell, train_generator, validation_generator):
    print("----->TRAIN")
    pred = np.argmax(modell.predict(train_generator, batch_size=32), axis=1)
    print(classification_report(train_generator.labels, pred))
    print("---->Test")
    pred = np.argmax(modell.predict(validation_generator, batch_size=32), axis=1)
    print(classification_report(validation_generator.labels, pred))
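Note that print_results only works because the generators it receives were built without shuffling: a generator's .labels attribute lists the true classes in file order, and model.predict emits rows in that same order only when shuffle is off (hence the un_shuffled_* generators). A minimal, self-contained sketch of the same argmax-then-report pattern on made-up probabilities (all values here are illustrative):

```python
import numpy as np
from sklearn.metrics import classification_report

# Stand-in for generator.labels: the true class index of each sample, in order.
true_labels = np.array([0, 1, 2, 3, 0, 1, 2, 3])
# Stand-in for model.predict(...): one softmax-like row of probabilities per sample.
probs = np.eye(4)[[0, 1, 2, 2, 0, 1, 2, 3]]
# argmax along axis 1 turns each probability row into a single predicted class.
pred = np.argmax(probs, axis=1)
print(classification_report(true_labels, pred))
```

If the generator shuffled its batches, .labels would no longer line up with the rows of predict, and the resulting report would be meaningless.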
def plot_results(hist):
    fig = plt.figure()
    fig.set_figheight(5)
    fig.set_figwidth(8)
    plt.plot(hist.history["accuracy"], label="Train")
    plt.plot(hist.history["val_accuracy"], label="Test")
    plt.title("Accuracy of Train and Validation Data")
    plt.xlabel("Epoch")
    plt.ylabel("Accuracy")
    plt.legend()
    plt.show()
    fig = plt.figure()
    fig.set_figheight(5)
    fig.set_figwidth(8)
    plt.plot(hist.history["loss"], label="Train")
    plt.plot(hist.history["val_loss"], label="Test")  # keyword must be lowercase "label", not "Label"
    plt.title("Loss of Train and Test Data")
    plt.xlabel("Epoch")
    plt.ylabel("Loss")
    plt.legend()
    plt.show()
labels_string = ["Bald Eagle", "Elk", "Raccoon", "Raven"]
test_pred = np.argmax(model.predict(un_shuffled_test_data), axis=1)
test_true_class = un_shuffled_test_data.labels
correct_predicted_images = 0
batch_offset = 0  # test_pred covers the whole dataset, while i is batch-local, so keep a running offset
for images, labels in un_shuffled_test_data:
    if correct_predicted_images >= 10:
        break
    for i in range(0, len(labels), 50):
        if np.argmax(labels[i]) == test_pred[batch_offset + i]:
            print(f"True Label = {labels_string[np.argmax(labels[i])]}, Predicted Label = {labels_string[test_pred[batch_offset + i]]}")
            plt.imshow(images[i].reshape((100, 100)), cmap='gray')
            plt.show()
            correct_predicted_images += 1
    batch_offset += len(labels)
True Label = Raccoon, Predicted Label = Raccoon
True Label = Raccoon, Predicted Label = Raccoon
True Label = Raccoon, Predicted Label = Raccoon
True Label = Raccoon, Predicted Label = Raccoon
True Label = Raccoon, Predicted Label = Raccoon
True Label = Raccoon, Predicted Label = Raccoon
True Label = Raccoon, Predicted Label = Raccoon
True Label = Raccoon, Predicted Label = Raccoon
True Label = Raccoon, Predicted Label = Raccoon
True Label = Raccoon, Predicted Label = Raccoon
wrong_predicted_images = 0
batch_offset = 0  # align the batch-local index i with the dataset-wide test_pred
for images, labels in un_shuffled_test_data:
    if wrong_predicted_images >= 10:
        break
    for i in range(len(labels)):
        if np.argmax(labels[i]) != test_pred[batch_offset + i]:
            print(f"True Label = {labels_string[np.argmax(labels[i])]}, Predicted Label = {labels_string[test_pred[batch_offset + i]]}")
            plt.imshow(images[i].reshape((100, 100)), cmap='gray')
            plt.show()
            wrong_predicted_images += 1
    batch_offset += len(labels)
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Bald Eagle
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Bald Eagle
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Bald Eagle
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Bald Eagle
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Bald Eagle
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Raccoon
True Label = Elk, Predicted Label = Bald Eagle
True Label = Elk, Predicted Label = Bald Eagle
True Label = Elk, Predicted Label = Raccoon
In some cases there is more than one animal in the picture, or the picture is not only of the animal: there are other objects besides the animal, such as trees and other elements of nature. Also, Raven and Bald Eagle look very much alike, so in some cases this similarity is the cause of the wrong prediction.
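These failure cases can also be summarized at a glance, rather than printed image by image, with a confusion matrix. The snippet below is a sketch on stand-in arrays so it runs on its own; with the real model, the inputs would be un_shuffled_test_data.labels and test_pred:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

labels_string = ["Bald Eagle", "Elk", "Raccoon", "Raven"]
y_true = np.array([0, 1, 1, 2, 3, 3, 1, 2])  # stand-in true classes
y_pred = np.array([0, 2, 2, 2, 0, 3, 1, 2])  # stand-in predictions
# Rows are true classes, columns are predicted classes, both in labels_string order.
cm = confusion_matrix(y_true, y_pred, labels=[0, 1, 2, 3])
print(cm)
```

A large off-diagonal entry, e.g. in the Elk row under the Raccoon column, immediately shows which pair of classes the model confuses most.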
An autoencoder is composed of encoder and decoder sub-models. The encoder compresses the input, and the decoder attempts to recreate the input from the compressed representation provided by the encoder. After training, the encoder model is saved and the decoder is discarded.
The encoder can then be used as a data-preparation technique to perform feature extraction on raw data, and the extracted features can be used to train a different machine learning model.
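For reference, a textbook reconstruction autoencoder in Keras might look like the sketch below (layer sizes are illustrative, not the notebook's). Note the contrast with the model built next, which attaches a 4-class softmax head to the 2-unit bottleneck instead of a reconstruction decoder:

```python
from tensorflow.keras import layers, models

# The decoder mirrors the encoder, and the model is trained to reproduce its
# own input, so the loss is a reconstruction loss (MSE) rather than a label loss.
inp = layers.Input(shape=(100 * 100,))
hidden = layers.Dense(256, activation="relu")(inp)
code = layers.Dense(2, activation="linear")(hidden)        # the bottleneck
out = layers.Dense(100 * 100, activation="sigmoid")(code)  # the reconstruction

autoencoder = models.Model(inputs=inp, outputs=out)
encoder = models.Model(inputs=inp, outputs=code)  # shares the trained weights
autoencoder.compile(optimizer="adam", loss="mse")
# After autoencoder.fit(x, x, ...), encoder.predict(x) yields the 2-D features.
```

Because the encoder model reuses the same layer objects, training the autoencoder trains the encoder for free.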
input = layers.Input(shape = (100, 100, 1))
output = layers.Flatten()(input)
output = layers.Dense(4096 , activation = "LeakyReLU" , kernel_regularizer = regularizers.l2(l2 = 0.0001))(output)
output = layers.Dense(2048 , activation = "LeakyReLU" , kernel_regularizer = regularizers.l2(l2 = 0.0001))(output)
encoded = layers.Dense(2 , activation = "linear" , kernel_regularizer = regularizers.l2(l2 = 0.0001))(output)
decoded = layers.Dense(4 , activation = "softmax" , kernel_regularizer = regularizers.l2(l2 = 0.0001))(encoded)
auto_encoder_model = models.Model(inputs = input , outputs = decoded)
encoder_model = models.Model(inputs = input , outputs = encoded)
auto_encoder_model.compile(
    optimizer=keras.optimizers.SGD(learning_rate=0.01, momentum=0.5),
    loss="categorical_crossentropy",
    metrics=["accuracy"]
)
history_encode = auto_encoder_model.fit(train_data, validation_data = validation_data, epochs = 20)
Epoch 1/20 9/43 [=====>........................] - ETA: 36s - loss: 5.3926 - accuracy: 0.2361
43/43 [==============================] - 68s 2s/step - loss: 2.8540 - accuracy: 0.3397 - val_loss: 2.1336 - val_accuracy: 0.4175
Epoch 2/20
43/43 [==============================] - 65s 2s/step - loss: 2.0975 - accuracy: 0.4314 - val_loss: 2.0892 - val_accuracy: 0.4416
Epoch 3/20
43/43 [==============================] - 65s 2s/step - loss: 2.0436 - accuracy: 0.4505 - val_loss: 2.1364 - val_accuracy: 0.3522
Epoch 4/20
43/43 [==============================] - 65s 2s/step - loss: 2.0003 - accuracy: 0.4806 - val_loss: 1.9988 - val_accuracy: 0.4759
Epoch 5/20
43/43 [==============================] - 65s 2s/step - loss: 1.9361 - accuracy: 0.5172 - val_loss: 2.0738 - val_accuracy: 0.3900
Epoch 6/20
43/43 [==============================] - 67s 2s/step - loss: 1.9294 - accuracy: 0.5363 - val_loss: 1.9380 - val_accuracy: 0.5275
Epoch 7/20
43/43 [==============================] - 68s 2s/step - loss: 1.9035 - accuracy: 0.5459 - val_loss: 1.9906 - val_accuracy: 0.4897
Epoch 8/20
43/43 [==============================] - 67s 2s/step - loss: 1.8485 - accuracy: 0.5767 - val_loss: 2.0031 - val_accuracy: 0.4519
Epoch 9/20
43/43 [==============================] - 67s 2s/step - loss: 1.8363 - accuracy: 0.5906 - val_loss: 1.9119 - val_accuracy: 0.5326
Epoch 10/20
43/43 [==============================] - 68s 2s/step - loss: 1.7677 - accuracy: 0.6280 - val_loss: 1.9412 - val_accuracy: 0.5189
Epoch 11/20
43/43 [==============================] - 67s 2s/step - loss: 1.7984 - accuracy: 0.5928 - val_loss: 1.8340 - val_accuracy: 0.5825
Epoch 12/20
43/43 [==============================] - 68s 2s/step - loss: 1.7389 - accuracy: 0.6376 - val_loss: 2.0038 - val_accuracy: 0.5172
Epoch 13/20
43/43 [==============================] - 67s 2s/step - loss: 1.7772 - accuracy: 0.6222 - val_loss: 1.9657 - val_accuracy: 0.5206
Epoch 14/20
43/43 [==============================] - 66s 2s/step - loss: 1.6267 - accuracy: 0.7021 - val_loss: 1.7461 - val_accuracy: 0.6495
Epoch 15/20
43/43 [==============================] - 67s 2s/step - loss: 1.6800 - accuracy: 0.6684 - val_loss: 1.9268 - val_accuracy: 0.5447
Epoch 16/20
43/43 [==============================] - 68s 2s/step - loss: 1.5897 - accuracy: 0.7073 - val_loss: 1.7647 - val_accuracy: 0.5945
Epoch 17/20
43/43 [==============================] - 67s 2s/step - loss: 1.6162 - accuracy: 0.6787 - val_loss: 1.9759 - val_accuracy: 0.5361
Epoch 18/20
43/43 [==============================] - 67s 2s/step - loss: 1.6071 - accuracy: 0.7153 - val_loss: 1.6924 - val_accuracy: 0.6753
Epoch 19/20
43/43 [==============================] - 67s 2s/step - loss: 1.5202 - accuracy: 0.7381 - val_loss: 1.7306 - val_accuracy: 0.6529
Epoch 20/20
43/43 [==============================] - 67s 2s/step - loss: 1.6531 - accuracy: 0.7124 - val_loss: 1.8054 - val_accuracy: 0.6237
encoder_pred_train = encoder_model.predict(un_shuffled_train_data, batch_size = 32)
encoder_pred_test = encoder_model.predict(un_shuffled_validation_data, batch_size = 32)
colors = ["pink", "blue", "yellow", "purple"]
# Color each point by its true class (the color legend is listed below).
for i in range(len(encoder_pred_train)):
    plt.scatter(encoder_pred_train[i][0], encoder_pred_train[i][1], marker='o', color=colors[un_shuffled_train_data.classes[i]])
plt.title("Train Data")
plt.show()
for i in range(len(encoder_pred_test)):
    plt.scatter(encoder_pred_test[i][0], encoder_pred_test[i][1], marker='o', color=colors[un_shuffled_validation_data.classes[i]])
plt.title("Test Data")
plt.show()
Here I showed each class as following colors:
Bald Eagle => pink
Elk => blue
Raccoon => yellow
Raven => purple
As we can see from the plots above, our model separates Raven and Raccoon fairly well, but the difference between Raven and Bald Eagle, for instance, is much less noticeable. In general, the less the class clusters overlap, the easier classification becomes. Also, as expected, the classes in the train-data plot are much more cleanly separated than in the test-data plot.
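This visual impression of overlap can also be quantified, for instance with the silhouette score, which is near 1 for tight, well-separated clusters and near 0 (or negative) for overlapping ones. A sketch on synthetic 2-D points so it runs standalone; with the real model one would pass encoder_pred_test and un_shuffled_validation_data.classes instead:

```python
import numpy as np
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# Two tight clusters far apart vs. two wide clusters almost on top of each other.
well_separated = np.vstack([rng.normal(loc=(0, 0), scale=0.1, size=(50, 2)),
                            rng.normal(loc=(5, 5), scale=0.1, size=(50, 2))])
overlapping = np.vstack([rng.normal(loc=(0, 0), scale=2.0, size=(50, 2)),
                         rng.normal(loc=(0.5, 0.5), scale=2.0, size=(50, 2))])
labels = np.array([0] * 50 + [1] * 50)

print(silhouette_score(well_separated, labels))  # close to 1
print(silhouette_score(overlapping, labels))     # close to 0
```

Computed on the train and test embeddings separately, this would make the "train separates better than test" observation a concrete number rather than a visual judgment.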
In this assignment I became familiar with Keras and TensorFlow, and it helped me get deeper into neural network concepts. I learned about different hyperparameters and how choices such as regularization and the optimizer can help us create a better model. Now I have a better understanding of how different factors affect our model and how to prevent overfitting.